I'm trying to create a desktop live-streaming app in C#. The program must run under Windows and stream the image of the user's desktop to an RTMP server. There must also be options for framerate, video size, quality and codec (H.263 and H.264). I think FFmpeg is the best choice for this (if it isn't, please leave a comment). I've managed to do everything mentioned above with ffmpeg.exe from the console. So I'd like to know: can I include the FFmpeg library in a C# project (as a .lib or .dll) and use it instead of the .exe, while keeping the functionality I need for this task? I'd be very grateful for any examples.
P.S. Here are some examples of commands I use:
ffmpeg -f dshow -i video=UScreenCapture -vcodec h264 -pix_fmt yuv420p -s 320x240 -f flv rtmp://[my adr]/desc
ffmpeg -f dshow -i video=UScreenCapture -vcodec h264 -r 15 -t 0:1:00 -q 12 -pix_fmt yuv420p -s 320x240 -f flv rtmp://[my adr]/desc
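For reference, this is roughly how I launch these commands from C# at the moment. It is only a minimal sketch: the Stream() helper, the ffmpeg.exe path and the width/height/fps parameters are placeholders I chose for illustration, not part of a finished design.

// Minimal sketch: launching ffmpeg.exe from C# with the same arguments I use on the console.
using System;
using System.Diagnostics;

class DesktopStreamer
{
    // Starts ffmpeg.exe capturing the desktop via UScreenCapture and pushing it to the given RTMP URL.
    public static Process Stream(string rtmpUrl, int width, int height, int fps)
    {
        var args = string.Format(
            "-f dshow -i video=UScreenCapture -vcodec h264 -r {0} " +
            "-pix_fmt yuv420p -s {1}x{2} -f flv {3}",
            fps, width, height, rtmpUrl);

        var psi = new ProcessStartInfo("ffmpeg.exe", args)
        {
            UseShellExecute = false,
            RedirectStandardError = true,  // ffmpeg writes its log to stderr
            CreateNoWindow = true
        };

        return Process.Start(psi);
    }
}

This works, but it depends on shipping and spawning a separate ffmpeg.exe, which is exactly what I'd like to replace with a library call if that's possible.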