I have been conducting some tests and am totally lost as to why Flash-based broadcasting drops frames.
The general (rough) formula I am using to estimate bitrate is:
bitrate = width x height x FPS
Now if this value is lower than Camera.bandwidth, why does Flash still drop frames? Any assistance/ideas are greatly appreciated.
I have checked with the FMS performance tab: bytes in is well below Camera.bandwidth as well, but Flash still drops frames… why?
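For reference, here is a minimal ActionScript 3 sketch (untested, values are illustrative) of how I am comparing that rough estimate against the bandwidth cap set via Camera.setQuality():

```actionscript
// Sketch only: compares the rough pixel-rate estimate against the
// outgoing bandwidth cap configured via Camera.setQuality().
import flash.media.Camera;

var cam:Camera = Camera.getCamera();
if (cam != null) {
    cam.setMode(160, 120, 15);     // request the mode under test
    cam.setQuality(131072, 0);     // cap bandwidth (bytes/sec), quality 0 = auto
    var rawRate:Number = cam.width * cam.height * cam.fps; // the rough formula
    trace("estimated rate: " + rawRate + ", cap: " + cam.bandwidth);
}
```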
Probably because the Internet was never intended to stream video or audio (or anything other than text, for that matter), but extremely creative people found a way.
Your answer has been reported as abuse to the Adobe forums admin. I find it a pity that there is such sarcasm on an MNC's forum page. If you cannot answer, please forward this to someone who can. Do not waste others' time and money.
For serious forum members, I am adding some more information that may help in understanding my issue. The moment I try to set a mode above 160 x 120, I start getting frame drops in the SWF client. Even if I decrease quality to 1, it doesn't matter. If I increase the buffer to 5 seconds, it doesn't matter. And to add to it, this is just on localhost, not even online.
*I have an NVIDIA 9500GT graphics card on an Intel Core i5 running Windows 7 Professional.
Ummmm, I was downright serious. Do you know much about networking? Go read up on the history of networking. I'm sorry you are offended. It's OK that you reported me. Adobe may contact me, they may not; time will tell. I hope an Adobe engineer or employee answers this question. I would like to hear the less philosophical answer to your question. If you don't want to wait for that, maybe looking up the networking protocols TCP and UDP will answer your questions, especially the UDP one.
Oh! You don't have to be an engineer to answer that. I reported you because I saw you had 306 points, which means you had to be a serious guy giving a serious answer. If you had read my question (you may want to read it again just in case)… here is the answer to it that I could find on my own. There is no need to make things complicated; God created things simple enough.
You will see that when we call the setMode function to request a width, height, and FPS, Flash will ask the device for the closest native settings the camera supports. I am pasting the documentation here so everyone can read it:
By default, Flash Player drops frames as needed to maintain image size. To minimize the number of dropped frames, even if this means reducing the size of the image, pass false for the favorArea parameter.
When choosing a native mode, Flash Player tries to maintain the requested aspect ratio whenever possible. For example, if you issue the command
myCam.setMode(400, 400, 30), and the maximum width and height values available on the camera are 320 and 288, Flash Player sets both the width and height at 288; by setting these properties to the same value, Flash Player maintains the 1:1 aspect ratio you requested.
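To illustrate the behavior described above, here is a minimal ActionScript 3 sketch (untested, assuming a standard webcam) that prefers frame rate over image area by passing false for favorArea:

```actionscript
// Sketch: request 320x240 at 30 fps, but let Flash Player shrink the
// image rather than drop frames, by passing false for favorArea.
import flash.media.Camera;

var cam:Camera = Camera.getCamera();
if (cam != null) {
    cam.setMode(320, 240, 30, false); // favorArea=false: favor frame rate
    trace("native mode chosen: " + cam.width + "x" + cam.height + " @ " + cam.fps);
}
```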
So as you see, it's a simple thing, nothing as complicated as NETWORKING, ALOHA, TCP/IP, etc. If you look at all the situations I have stated, it clearly implies that the frame drop happens regardless of any bandwidth settings as soon as I go over a certain resolution.
Have a nice day! Cheers
I’m glad you found the answer you were looking for. Goodbye.