I’ve been playing around with live streaming from ffmpeg recently, and my latest adventure was to try adding a time stamp to the feed. I searched Google for a solution, but couldn’t find a complete howto, so this is pieced together from information I found all over the net.
Turns out, all the information you really need to get this working is already in the libavfilter documentation; I just didn’t read it carefully enough.
First of all you need to have a recent build of ffmpeg, with the --enable-libfreetype flag enabled. Just use this excellent howto, and add the flag yourself in the configure step. You must also make sure to have the libfreetype-dev package installed. This is all provided you use a Debian or Ubuntu based distro, of course.
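For reference, the relevant part of the build might look something like this sketch. Everything besides --enable-libfreetype is just a stand-in for whatever flags the howto gives you, and the exact -dev package name can vary between releases:

sudo apt-get install libfreetype6-dev   # or libfreetype-dev, depending on your release
./configure --enable-gpl --enable-libx264 --enable-libfreetype   # plus your usual flags
make && sudo make install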
Once you have built ffmpeg you can check if you have the necessary filter installed with this command:
ffmpeg -filters | grep drawtext
That should print out the following line:
drawtext Draw text on top of video frames using libfreetype library.
Now you should be able to do something like this:
ffmpeg -f video4linux2 -i /dev/video0 -s 640x480 -r 30 -vf \
"drawtext=fontfile=/usr/share/fonts/truetype/ttf-dejavu/DejaVuSans-Bold.ttf: \
text='\%T': fontcolor=white@0.8: x=7: y=460" -vcodec libx264 -vb 2000k \
-preset ultrafast -f mp4 output.mp4
In short, this sets up capture from the v4l2 device /dev/video0 with a frame size of 640×480 at 30 fps (pretty common for older webcams). The -vf option is where the filter gets applied: fontfile gives the path to a TTF font, and text contains the text we want, in this case a strftime() sequence we want expanded (see man strftime for a full list of parameters). Note the escaping slash in front of the %. Then we set the font color to white, with 80% opacity. There are many other options, such as fontsize, but I haven’t tried them.
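If you want to experiment with those, here’s an untried variation of the command above with an explicit fontsize and a semi-transparent box behind the text. The values are just guesses on my part, not something I’ve verified:

ffmpeg -f video4linux2 -i /dev/video0 -s 640x480 -r 30 -vf \
"drawtext=fontfile=/usr/share/fonts/truetype/ttf-dejavu/DejaVuSans-Bold.ttf: \
text='\%T': fontcolor=white@0.8: fontsize=24: box=1: boxcolor=black@0.5: x=7: y=450" \
-vcodec libx264 -vb 2000k -preset ultrafast -f mp4 output.mp4

The box=1 and boxcolor options simply draw a filled rectangle behind the text, which helps keep the time stamp readable over bright video.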
Hope this is useful for someone out there.
Addendum, February 21, 2016
Turns out, this blog post has been dug up by people from time to time, judging from the number of pingbacks it has accumulated over the years. So in case you’re here now: ffmpeg made some changes to how they do text expansion (go figure), and the link to the documentation changed too. Here’s a revised example, using a more modern camera with mjpeg for good measure:
ffmpeg -f video4linux2 -input_format mjpeg -s 1280x720 -i /dev/video0 \
-vf "drawtext=fontfile=/usr/share/fonts/dejavu/DejaVuSans-Bold.ttf: \
text='%{localtime\:%T}': fontcolor=white@0.8: x=7: y=700" -vcodec libx264 \
-preset veryfast -f mp4 -pix_fmt yuv420p -y output.mp4
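If you want the date in there as well, the localtime function takes a full strftime format string as its argument. Here’s a sketch of the same command with a longer format, untested on my end; note the dots instead of colons in the time, which avoids yet another layer of escaping inside the expansion:

ffmpeg -f video4linux2 -input_format mjpeg -s 1280x720 -i /dev/video0 \
-vf "drawtext=fontfile=/usr/share/fonts/dejavu/DejaVuSans-Bold.ttf: \
text='%{localtime\:%Y-%m-%d %H.%M.%S}': fontcolor=white@0.8: x=7: y=700" -vcodec libx264 \
-preset veryfast -f mp4 -pix_fmt yuv420p -y output.mp4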
Oddly, on Windows some of the strftime parameters don’t seem to expand right (and if even one doesn’t work, it outputs *nothing*).
This did work though: -vf "drawtext=fontfile=subfont.ttf:text='%m/%d/%y %H\:%M\:%S'"
Thank you so much.
localtime (newest ffmpeg standard):
-vf drawtext="fontfile=arial.ttf:fontsize=14:fontcolor=white:box=1:boxcolor=black@0.9:x=08:y=466:text='%%{localtime\: %%m/%%d/%%Y %%I.%%M.%%S %%p}'"
strftime (deprecated ffmpeg standard):
-vf drawtext="expansion=strftime:fontfile=arial.ttf:fontsize=14:fontcolor=white:box=1:boxcolor=black@0.9:x=08:y=466:text='%%m\/%%d\/%%Y %%I\:%%M\:%%S \%%p'"
Hi,
nice job.
Was looking for something like this for hours and hours.
My final result was https://vimeo.com/160205640
(somebody) did the pictures as a LaTeX template.
With a bash script I created a single picture for each minute of the day.
With another bash script I read out the timestamp of each picture and overlaid the matching clock face. Then I put them all together into a movie…
That was a pretty neat idea!
Really nice job.
I want to ask you: how can I do it if I also want to print the milliseconds?
Maybe this post is too old, but I just found it.
Here’s my answer on Windows:
-vf "yadif,setdar=4/3,drawtext=fontfile=/Windows/Fonts/arial.ttf:fontsize=45:fontcolor=white@0.8:box=1:boxcolor=gray@0.9:x=70:y=20:text='%%{localtime\: %%m/%%d/%%Y %%I.%%M.%%S %%p}'"