Hi,
I have successfully built a DTE recorder using a CM4. This is a device for recording DV or HDV video over a FireWire connection. For this I used a PCIe FireWire card, a custom Raspberry Pi kernel with FireWire enabled, and a fork of the dvgrab software. One of the things dvgrab allows is piping the video output to a media player so you can preview the video, albeit on a delay. I use ffplay for this because I can display timecode and record status as overlays on the preview. Apart from some software improvements still to make, this is now working very well and the recordings are perfectly fine.
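For reference, the preview pipeline is roughly along these lines (the overlay text file path is just an example, and my dvgrab fork's options differ slightly):

    # dvgrab writes the HDV stream (an MPEG-2 transport stream) to stdout ("-"),
    # ffplay reads it from stdin and draws the timecode/status text over the video.
    # A small script keeps rewriting /tmp/overlay.txt, and reload=1 makes drawtext
    # re-read it every frame. Depending on the ffmpeg build you may need to add a
    # fontfile= option to drawtext.
    dvgrab --format hdv - | \
        ffplay -vf "drawtext=textfile=/tmp/overlay.txt:reload=1:fontcolor=white:fontsize=36:x=20:y=20" -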
However, it got me thinking of a useful side project: converting the video from FireWire to HDMI. I have a lot of HDV cameras and very few have HDMI outputs built in. There are FireWire-to-SDI converters, but they are rare and expensive. So I thought that by using dvgrab and piping the output to ffplay in fullscreen mode, I could connect the HDMI out from the CM4 board to a TV or capture device, and the CM4 would then act as an HDV-to-HDMI converter.

However, when I tried this I realised the video renders at around 10 to 15 fps at best. That's too many dropped frames to be usable. My thought is that the CM4 simply isn't quick enough to play back 1080p video in real time this way, yet the captured video files do play back smoothly afterwards. I was considering a Raspberry Pi 5, since that now exposes a PCIe connection, but even that might not be enough. I know ARM systems are not the best for video rendering, but I wondered whether there are any settings for dvgrab or ffplay, or some way to push the video hardware a bit harder, that would improve the results. I wouldn't mind if this increases the delay, since I'm planning to use it for playback only. Any advice appreciated.
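In case it helps, the converter attempt is essentially the same pipeline sent fullscreen to the HDMI output, and these are the sort of standard ffplay flags I have been experimenting with so far (nothing exotic):

    # -fs         fullscreen on the HDMI output
    # -framedrop  drop late frames rather than letting the delay grow unbounded
    # -infbuf     don't limit the input buffer, since dvgrab keeps feeding data
    # -sync ext   free-run against an external clock instead of syncing to audio
    dvgrab --format hdv - | ffplay -fs -framedrop -infbuf -sync ext -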
Otherwise, and I'm loath to do it, I would need to use an x86-architecture machine. Does anyone have experience of the ASRock N100DC-ITX, and whether it would be quick enough to handle this job?
Thanks in advance.