According to this question, What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?, it is possible to get compressed data from the iPhone camera, but as I've been reading in the AVFoundation reference, you only get uncompressed data.
So the questions are:
1) How to get compressed frames and audio from iPhone's camera?
2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?
Any help will be really appreciated.
Thanks.
You most likely already know this...
1) How to get compressed frames and audio from iPhone's camera?
You cannot do this. The AVFoundation API prevents this from every angle. I even tried named pipes and some other sneaky Unix foo. No such luck. You have no choice but to write it to a file. In your linked post, a user suggests setting up a callback to deliver encoded frames. As far as I am aware, this is not possible for H.264 streams. The capture delegate will deliver images encoded in a specific pixel format. It is the movie writers and AVAssetWriter that do the encoding.
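To make the capture-delegate point concrete, here is a minimal sketch of that path in Swift (the answer predates Swift, so this is a modernized illustration, not the author's code; the class and queue names are made up for the example). The delegate callback only ever hands you uncompressed CVPixelBuffers in the pixel format you asked for, never H.264 packets:

```swift
import AVFoundation

// Sketch: the capture delegate receives *uncompressed* frames
// (CVPixelBuffers in the requested pixel format), never H.264 data.
final class FrameReceiver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        // You pick the raw pixel format here; 32BGRA is a common choice.
        output.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // One uncompressed frame per callback; compressing it is up to you,
        // or to an AVAssetWriter writing H.264 to a file.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        _ = pixelBuffer  // ... hand the raw frame to your encoder here ...
    }
}
```

The sketch is why the "callback that delivers encoded frames" idea fails: nothing in this delegate path ever sees compressed output.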
2) Encoding uncompressed frames with ffmpeg's API is fast enough for real-time streaming?
Yes, it is. However, you will have to use libx264, which gets you into GPL territory. That is not exactly compatible with the App Store.
I would suggest using AVFoundation and AVAssetWriter for efficiency reasons.