Blog
DirectShow Filters
Date: 18/10/2005
I'm trying to write a DirectShow app that previews video/audio from my TV tuner card. So far, so good: the video is working fine and I can change channels and fine-tune. But I'm kinda stumped as to why the audio isn't working. This is what my filter graph looks like:

[Filter graph screenshot]

The TV card has an analogue audio out with a cable running around to the sound card's Line-In. I've got some code that runs through the pins on the sound card's capture filter, finds the "Line In" pin, enables it, and sets the volume to 1.0. Maybe the next step is playing with the crossbar to get the audio routed to the right place.
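Roughly what that pin code boils down to; a minimal sketch, assuming pAudioCap already points to the sound card's capture filter (the name and the "Line" match are just for illustration), with most error handling trimmed:

#include <dshow.h>
#include <wchar.h>

// Enumerate the audio capture filter's input pins, enable the "Line In" one
// and set its recording level to full.
HRESULT EnableLineIn(IBaseFilter *pAudioCap)
{
    IEnumPins *pEnum = NULL;
    HRESULT hr = pAudioCap->EnumPins(&pEnum);
    if (FAILED(hr)) return hr;

    IPin *pPin = NULL;
    while (pEnum->Next(1, &pPin, NULL) == S_OK)
    {
        PIN_INFO info;
        if (SUCCEEDED(pPin->QueryPinInfo(&info)))
        {
            if (info.dir == PINDIR_INPUT && wcsstr(info.achName, L"Line") != NULL)
            {
                // Input pins on audio capture filters expose IAMAudioInputMixer.
                IAMAudioInputMixer *pMix = NULL;
                if (SUCCEEDED(pPin->QueryInterface(IID_IAMAudioInputMixer, (void**)&pMix)))
                {
                    pMix->put_Enable(TRUE);   // select this input for recording
                    pMix->put_MixLevel(1.0);  // full recording volume
                    pMix->Release();
                }
            }
            if (info.pFilter) info.pFilter->Release();
        }
        pPin->Release();
    }
    pEnum->Release();
    return S_OK;
}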

Update: Well, when it rains it pours. I followed my nose and started playing with the crossbar interface. I decided to route the "Audio Tuner In" pin to the "Audio Decoder Out" pin. Seems logical enough. And whoooo, sound! But now the problem is that I have 2 copies of the audio, slightly offset from each other, creating an echo. Interestingly, as soon as the graph starts running, 1 lot of sound is heard, but when the message loop starts the 2nd lot is added. Then when the application quits the 1st lot of sound remains. Interesting.
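For reference, that crossbar routing is roughly the following; a sketch assuming pBuilder is the ICaptureGraphBuilder2 and pVideoCap is the TV card's capture filter (both names are just placeholders), error handling trimmed:

#include <dshow.h>

// Find the crossbar upstream of the capture filter and route the
// "Audio Tuner In" input pin to the "Audio Decoder Out" output pin.
HRESULT RouteTunerAudio(ICaptureGraphBuilder2 *pBuilder, IBaseFilter *pVideoCap)
{
    IAMCrossbar *pXbar = NULL;
    HRESULT hr = pBuilder->FindInterface(&LOOK_UPSTREAM_ONLY, NULL, pVideoCap,
                                         IID_IAMCrossbar, (void**)&pXbar);
    if (FAILED(hr)) return hr;

    long outPins = 0, inPins = 0;
    pXbar->get_PinCounts(&outPins, &inPins);

    long tunerIn = -1, decoderOut = -1;
    for (long i = 0; i < inPins; i++)
    {
        long related = 0, type = 0;
        pXbar->get_CrossbarPinInfo(TRUE, i, &related, &type);
        if (type == PhysConn_Audio_Tuner) tunerIn = i;
    }
    for (long o = 0; o < outPins; o++)
    {
        long related = 0, type = 0;
        pXbar->get_CrossbarPinInfo(FALSE, o, &related, &type);
        if (type == PhysConn_Audio_AudioDecoder) decoderOut = o;
    }

    if (tunerIn >= 0 && decoderOut >= 0 && pXbar->CanRoute(decoderOut, tunerIn) == S_OK)
        hr = pXbar->Route(decoderOut, tunerIn);  // output index first, then input

    pXbar->Release();
    return hr;
}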

Update2: This is what the final working graph looks like, which is really just the bottom section of the first graph above:

[Final filter graph screenshot]

Comments:
SnappyCrunch
18/10/2005 1:16pm
I wish I knew something about DirectShow coding. That app sounds interesting.
fret
24/10/2005 9:35am
OK, I've worked out that the second copy of the audio is coming from the extra audio device I added to the graph. I originally thought that I would have to add both the video source device and the audio source device to the graph to get it to work. But this is actually incorrect, as the graph builder seems to know what to do to get audio working anyway.

All I had to do was get the right routing happening inside the crossbar, which in my case is "Audio Tuner In" -> "Audio Decoder Out", and the graph builder seems to know that it has to get the audio from the SBLive's line-in anyway.
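A sketch of the simpler graph build that implies, assuming pGraph, pBuilder and pVideoCap already exist (the filter graph, the capture graph builder and the TV card's capture filter; the function name is just illustrative) and that only the video capture filter is added by hand:

#include <dshow.h>

// Build the preview graph with just the TV card's capture filter. RenderStream
// pulls in the tuner/crossbar filters upstream by itself, and the audio path
// is simply the physical cable into the sound card's Line-In, so no audio
// capture filter needs to be added to the graph.
HRESULT BuildPreviewGraph(IGraphBuilder *pGraph,
                          ICaptureGraphBuilder2 *pBuilder,
                          IBaseFilter *pVideoCap)
{
    HRESULT hr = pBuilder->SetFiltergraph(pGraph);
    if (FAILED(hr)) return hr;

    hr = pGraph->AddFilter(pVideoCap, L"TV Capture");
    if (FAILED(hr)) return hr;

    // Render the preview pin; the crossbar routing (Audio Tuner In ->
    // Audio Decoder Out) is still done separately as described above.
    return pBuilder->RenderStream(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video,
                                  pVideoCap, NULL, NULL);
}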
fret
24/10/2005 10:31am
Interestingly, I've been playing with the video window size to see what effect it has on the CPU usage. Originally I picked the same size window as the TV app that came with the card and it was giving me 50% CPU usage. Not too hot for just watching TV. So then I thought I would try to get the source size and set the output window to that. The source size according to the IBasicVideo interface was 320x240 despite the card being in PAL mode, and when I made the output window the exact same size, CPU usage dropped to.... wait for it... 0%. *blink* *blink* Yeah, I know... and every now and then it'd jump to 10 or 20% before dropping back to 0%. So it obviously makes a huge difference how you run things. I would expect that there is a way of getting the video output to be scaled in hardware, but I don't have the foggiest idea how to make that happen, so 320x240 it is for me.
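In case it helps anyone, the sizing trick is roughly this; a sketch assuming pGraph is the graph's IGraphBuilder and hwnd is the window hosting the video (both placeholders):

#include <dshow.h>

// Read the native source size from IBasicVideo and size the video window's
// client area to match, so the renderer doesn't have to scale every frame.
HRESULT SizeWindowToSource(IGraphBuilder *pGraph, HWND hwnd)
{
    IBasicVideo *pBasic = NULL;
    IVideoWindow *pWin = NULL;

    HRESULT hr = pGraph->QueryInterface(IID_IBasicVideo, (void**)&pBasic);
    if (FAILED(hr)) return hr;
    hr = pGraph->QueryInterface(IID_IVideoWindow, (void**)&pWin);
    if (FAILED(hr)) { pBasic->Release(); return hr; }

    long w = 0, h = 0;
    hr = pBasic->GetVideoSize(&w, &h);       // e.g. 320x240 from the capture pin
    if (SUCCEEDED(hr))
    {
        pWin->put_Owner((OAHWND)hwnd);
        pWin->put_WindowStyle(WS_CHILD | WS_CLIPSIBLINGS);
        pWin->SetWindowPosition(0, 0, w, h); // 1:1 output, no scaling
    }

    pWin->Release();
    pBasic->Release();
    return hr;
}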

One of my main aims is to explore the performance tradeoffs of the media system for previewing and capturing video, as my machine is kinda low-spec by current-day standards and any possibility of an upgrade is receding into the future.
 