{"_id":"5887d259a8fe111900092592","category":{"_id":"55e75b39e06f4b190080dbfe","project":"55d535ca988e130d000b3f5c","__v":10,"pages":["56959043fe18811700c9c09e","569590bfcb14e11700f8a877","569590f7fcb1032d0089e033","5695917dfcb1032d0089e035","5695964a77ba0d2300cf3912","5695967edcaf0d1700cb8752","569618eccb14e11700f8a910","56961d937596a90d0014e571","5696ba13480534370022a37a","56dd002ee5c8570e00a79865"],"version":"55d535cb988e130d000b3f5f","sync":{"url":"","isSync":false},"reference":false,"createdAt":"2015-09-02T20:25:29.622Z","from_sync":false,"order":3,"slug":"frame-for-business","title":"Frame for Business"},"project":"55d535ca988e130d000b3f5c","version":{"_id":"55d535cb988e130d000b3f5f","__v":12,"project":"55d535ca988e130d000b3f5c","hasDoc":true,"hasReference":false,"createdAt":"2015-08-20T02:04:59.052Z","releaseDate":"2015-08-20T02:04:59.052Z","categories":["55d535cc988e130d000b3f60","55d6b238d2a8eb1900109eef","55d6b4f3250d7d0d004274cd","55d7967960fc730d00fc2852","55da9804e835f20d009fc5d0","55e75b1de06f4b190080dbfd","55e75b39e06f4b190080dbfe","55e75b7ae06f4b190080dbff","564f5a4e33082f0d001bb709","570fb64aa38d470e0060cbff","586d0dd89a854123001acd65","586d0e3b9a854123001acd66"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"","version_clean":"1.0.0","version":"1.0"},"__v":0,"user":"587e9209db3b0319007af1fc","parentDoc":null,"updates":[],"next":{"pages":[],"description":""},"createdAt":"2017-01-24T22:16:57.770Z","link_external":false,"link_url":"","githubsync":"","sync_unique":"","hidden":false,"api":{"results":{"codes":[]},"settings":"","auth":"required","params":[],"url":""},"isReference":false,"order":16,"body":"[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Overview\"\n}\n[/block]\nFrame runs your applications on powerful servers in the cloud and streams screen updates down to your browser using a H.264-based video streaming protocol. H.264 is a proven and flexible technology used in everything from HDTV broadcast and Digital Cinema applications, Blu-ray players and digital video recorders, through to CCTV and video surveillance systems. It’s the same protocol that Netflix, YouTube, Apple and others use to stream movies and TV shows from their data centers to your TVs, PCs, and phones. So, it’s well suited to delivering services across long distances.\n\nH.264 offers a lot of flexibility over the ways it can process images. By default, Frame’s implementation is configured to provide a pragmatic balance between image quality and bandwidth that works well for delivering both rapidly changing video content (e.g., teleconferencing services and computer gaming) and high-resolution graphics apps like CAD packages. While this default setting works well for most applications and in most situations, there are some circumstances where manual tuning of the protocol’s Quality of Service (QoS) characteristics can improve user experience. The Frame Terminal has an optional advanced control panel, where QoS settings can be individually adjusted by users to optimize their experience. 
QoS settings can be used either to improve overall display quality or to prioritize one performance characteristic ahead of the others when bandwidth is limited.\f\n[block:image]\n{\n  \"images\": [\n    {\n      \"image\": [\n        \"https://files.readme.io/ede79f2-Generic_Network_Diagram_-_Page_1_2.png\",\n        \"Generic Network Diagram - Page 1 (2).png\",\n        1631,\n        680,\n        \"#c6e4cc\"\n      ]\n    }\n  ]\n}\n[/block]\n\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Accessing the QoS Settings\"\n}\n[/block]\nIf enabled by your system admin, the QoS settings page is accessible from the 'Settings' option on the terminal’s 'Gear' menu. \n[block:image]\n{\n  \"images\": [\n    {\n      \"image\": [\n        \"https://files.readme.io/d87d254-QOS_Menu.PNG\",\n        \"QOS Menu.PNG\",\n        265,\n        389,\n        \"#262727\"\n      ]\n    }\n  ]\n}\n[/block]\n Frame Admins Note: The QoS Settings menu is suppressed by default. Frame Platform Ultimate customers can enable protocol QoS support immediately. Other customers wanting to enable access to the QoS settings should email support:::at:::fra.me, and we will take care of it for you.\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"QoS Settings Control Panel\"\n}\n[/block]\n\fSelecting Settings from the menu brings up the QoS Settings Control Panel:\n[block:image]\n{\n  \"images\": [\n    {\n      \"image\": [\n        \"https://files.readme.io/dfb9daf-QOS_Settings.PNG\",\n        \"QOS Settings.PNG\",\n        612,\n        707,\n        \"#e6edee\"\n      ]\n    }\n  ]\n}\n[/block]\nChanges to QoS settings persist for the duration of a user’s session and revert back to the default configuration whenever a session is terminated or disconnected for any reason. If for any reason changes to QoS settings cause significant usability issues, you can simply disconnect the session and reconnect to restore the default settings.\n\nNote: When changes are applied, the terminal connection may be interrupted briefly, but will reconnect automatically\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Streaming settings explained\"\n}\n[/block]\n## Video encoding presets\nFrame supports multiple custom presets presets optimised for specific use cases. Both variable bit rate (VBR) and constant bit rate (CBR) presets are available. Variable bit rate presets are better for displaying mix of static and dynamic content (e.g., static display of a 3D model combined with rotating/translating parts); while constant bit rate presets are better for displaying dynamic content (movie-like experiences, video conferencing, video games, etc.). There is also a low bit rate preset which minimizes bandwidth used, useful for network connections with very limited available bandwidth.\n\n**Available Video Presets**\n\n  * Auto: Default video preset\n  * vbrlow: Variable Bit Rate - Low Quality \n  * vbrhigh: Variable Bit Rate - High Quality\n  * cbrlow: Constant Bit Rate - Low Quality\n  * cbrhigh: Constant Bit Rate - High Quality\n  * lowbitrate: \n\n##Max frame rate \nFrame automatically adjusts the video frame rate in response to application activity and available bandwidth. Under normal circumstances, the maximum frame rate is 30 frame per second (fps). 
Limiting the maximum frame rate can reduce bandwidth requirements, but may cause choppiness and can make interactive editing tasks difficult.\n\nSupported range: 5 - 30 fps for GPU enabled instances \n\t\t     5 - 15 fps for Frame Air instances \n\n##Max video bit rate\nFrame limits the maximum video stream bit rate to 16,000 kbps. Lowering the bit rate limits the overall bandwidth available to Frame, reducing both frame rate and image quality.\n\nSupported range: 256 kbps  - 16,000 kbps\n\n**Limitations**\nSetting maximum video bit rate to a value lower than 6,000 kbps will also affect maximum frame rate and maximum audio bit rate. For example, if maximum video bit rate is set to 2,000 kbps, frame rate will be automatically limited to 14 fps, and audio bit rate will be limited to 128 kbps. \n\n##Max audio bit rate: \nBy limiting the maximum audio bit rate, it is possible to reduce the bandwidth available for audio independently of any settings used by the audio source to reduce overall bandwidth requirements.  \n\nSupported range: 0 - 160 kbps. 0 disables audio channel.\n\n##Scale video \nChanging the video scale resizes the virtual desktop/app session to reduce the overall bandwidth requirements. Scale Video can reduce the session size by as much as 50%. Setting Video Scale to 0.5 (50%) would scale down a full screen session running at 1024×768 to 512x384. At this resolution, the resized display occupies one quarter of the amount of space of the original and requires approximately one quarter of the original bandwidth to transmit. When the Frame terminal receives the image, it is then rendered at its original size but at a lower resolution. You get a similar effect if you try watching a YouTube video with video quality set to 360P. While the resultant image is blurred, it may still be acceptable, depending on the content being viewed. Work requiring high image fidelity to such as editing documents and spreadsheets is unlikely to benefit from this approach, but it may be suitable when participating in a video-conference where the highest video quality is not required.\n\nSupported range: 0.5 - 1 (50% - 100% of original session size)\n[block:image]\n{\n  \"images\": [\n    {\n      \"image\": [\n        \"https://files.readme.io/2c837fd-Scale_Video.PNG\",\n        \"Scale Video.PNG\",\n        651,\n        293,\n        \"#cc89a7\"\n      ]\n    }\n  ]\n}\n[/block]\n##Max video quantization\nThe video quantization setting controls the way that the Frame protocol encodes the video stream. Quantization can be thought of as a measure of degree to which a video stream can be compressed for a given image quality. Quantization is determined dynamically by the H.264 encoder based on both the available bandwidth and the display content.  How this is done is beyond the scope of this article, but the output is readily understandable. Complex content such as a high resolution CAD drawing, is assigned a low quantization factor and only lightly compressed to maintain quality, while simple ‘low information’ content will receive a higher quantization value and so be compressed to a greater degree. \n\nThe Max video quantization control sets the amount of compression that the encoder must use across a range from 48 (heavy compression, lower image quality) to 24 (light compression, better image quality), with a default setting of 42. 
Increasing the quantization factor from 42 towards 48 will force the encoder to compress all content more aggressively, reducing the amount of bandwidth required but with a risk of increasing visible compression artifacts in complex images. Decreasing the quantization factor towards 24 will permit the encoder to use less compression and so achieve higher image quality, but only if there is bandwidth available to transmit the video stream. Setting a low quantization factor will not improve image quality unless there is sufficient bandwidth available for the encoder to take advantage of it.\n\nNote: Setting quantization to a low value (24-28) may make network latency effects more pronounced.\n\n##Best video quality\nFrame’s H.264 implementation uses YUV 4:2:0 chroma subsampling to encode images. This takes advantage of the human eye’s inability to recognize color differences to the same degree that it can recognize variations in brightness. By sending less information about color than it does about brightness, it is possible to reduce the amount of bandwidth required substantially without significantly compromising image quality. This does an excellent job of reducing the amount of bandwidth required, but in some situations, especially in apps where regions of strongly contrasting colors are displayed next to each other, chroma subsampling can result in colors “bleeding” into each other with undesirable results. \n\nTo support our customers who need absolute color fidelity, Frame also provides support for YUV 4:4:4 encoding. This turns off chroma subsampling sending the full depth of color information for every pixel. As it sends more color information, there is a corresponding increase in required bandwidth. YUV 4:4:4 encoding is enabled by selecting “Best video quality”. The Frame Terminal session sizing icon (Magnifying Glass Icon) changes color to indicate when \"Best video quality\" is in use as follows:\n\nGreen: The session is in Best Video Quality mode\nOrange: Best Video Quality (YUV 4:4:4) mode has been selected but the session is unable to provide it due to bandwidth limitations\nWhite: Best Video Quality mode has not been selected (YUV 4:2:0) \n\n[block:image]\n{\n  \"images\": [\n    {\n      \"image\": [\n        \"https://files.readme.io/7f701a2-Image_Quality_Indicator.png\",\n        \"Image Quality Indicator.png\",\n        700,\n        148,\n        \"#1f3046\"\n      ]\n    }\n  ]\n}\n[/block]\nNOTE: This Best video quality is available only when using Chrome browser.\n\n##Grayscale\nWith Greyscale enabled, color information is not transmitted (this is YUV 4:0:0 encoding), enabling a substantial reduction in bandwidth.\n\nAs Greyscale does not send any color information and “Best video quality” sends full color information, these two setting are mutually exclusive.\n\n\n##Recommendations\n###Unconstrained Networks\nIn environments where network bandwidth is effectively unconstrained, it is possible to adjust Frame QoS settings to enhance user experience in situations where improved image quality can be advantageous.\n\n**Picture editing/proofing**\nEnable “Best Video Quality”’ to ensure highest color fidelity.\nDecrease “‘Video Quantization” to minimize image artifacts.\n\n**Spreadsheets, CAD packages, and similar applications**\n\nEnable “Best Video Quality”’ to prevent 'color bleed'. 
\nDecrease “‘Video Quantization” to minimize to minimize image artifacts.\n\n###Bandwidth Constrained Networks\n\n **Gaming**\nPrioritize FPS for best overall experience\nConsider reducing ‘Video Scale’ or increasing ‘Max Quantization’ to reduce bandwidth requirements to allow an increase in the number of sessions.\n\n**Videoconferencing and webinars**\nBandwidth requirements for VoIP and audio/video conferencing services vary greatly. Codec choice, audio quality, display resolution and aspect ratio, and the complexity of the images activity levels all play a significant role in determining bandwidth requirements. Individual settings can usually be adjusted reduce overall bandwidth requirements. However, not all system offer sufficient control to develop acceptable performance in low bandwidth settings. Further reduction in bandwidth requirements can be made by adjusting the Frame QOS if necessary.\n\nConsider reducing “Video Scale” as a simple means of reducing the overall bandwidth required without the need to adjust more advanced QoS options. \n\nAdditional adjustments to reduce video bandwidth requirements include:\nIncreasing \"Video Quantization\" which will reduce bandwidth requirements at a cost of increased 'blockiness' in the video stream. Reducing “Max Video bit rate” which can reduce bandwidth requirements at the expense of overall video quality and frame rate. By reducing “Max Video bit rate” it is possible to limit the overall bandwidth consumed by a Frame session and so ensure additional capacity is reserved for other purposes.","excerpt":"","slug":"frame-terminal-quality-of-service-qos-settings","type":"basic","title":"Frame Terminal Quality of Service (QoS) Settings"}

Frame Terminal Quality of Service (QoS) Settings


[block:api-header]
{
  "type": "basic",
  "title": "Overview"
}
[/block]
Frame runs your applications on powerful servers in the cloud and streams screen updates down to your browser using an H.264-based video streaming protocol. H.264 is a proven and flexible technology used in everything from HDTV broadcast and digital cinema applications, Blu-ray players, and digital video recorders through to CCTV and video surveillance systems. It’s the same protocol that Netflix, YouTube, Apple, and others use to stream movies and TV shows from their data centers to your TVs, PCs, and phones, so it’s well suited to delivering services across long distances.

H.264 offers a lot of flexibility in the way it processes images. By default, Frame’s implementation is configured to provide a pragmatic balance between image quality and bandwidth that works well both for rapidly changing video content (e.g., teleconferencing services and computer gaming) and for high-resolution graphics apps like CAD packages. While this default works well for most applications and in most situations, there are circumstances where manual tuning of the protocol’s Quality of Service (QoS) characteristics can improve the user experience. The Frame Terminal has an optional advanced control panel where users can adjust QoS settings individually to optimize their experience. QoS settings can be used either to improve overall display quality or to prioritize one performance characteristic over the others when bandwidth is limited.
[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/ede79f2-Generic_Network_Diagram_-_Page_1_2.png",
        "Generic Network Diagram - Page 1 (2).png",
        1631,
        680,
        "#c6e4cc"
      ]
    }
  ]
}
[/block]

[block:api-header]
{
  "type": "basic",
  "title": "Accessing the QoS Settings"
}
[/block]
If enabled by your system admin, the QoS settings page is accessible from the 'Settings' option on the terminal’s 'Gear' menu.
[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/d87d254-QOS_Menu.PNG",
        "QOS Menu.PNG",
        265,
        389,
        "#262727"
      ]
    }
  ]
}
[/block]
Note for Frame admins: The QoS Settings menu is suppressed by default. Frame Platform Ultimate customers can enable protocol QoS support immediately. Other customers wanting to enable access to the QoS settings should email support@fra.me, and we will take care of it for you.
[block:api-header]
{
  "type": "basic",
  "title": "QoS Settings Control Panel"
}
[/block]
Selecting Settings from the menu brings up the QoS Settings Control Panel:
[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/dfb9daf-QOS_Settings.PNG",
        "QOS Settings.PNG",
        612,
        707,
        "#e6edee"
      ]
    }
  ]
}
[/block]
Changes to QoS settings persist for the duration of a user’s session and revert to the default configuration whenever a session is terminated or disconnected for any reason. If changes to QoS settings cause significant usability issues, simply disconnect the session and reconnect to restore the default settings.

Note: When changes are applied, the terminal connection may be interrupted briefly, but it will reconnect automatically.
[block:api-header]
{
  "type": "basic",
  "title": "Streaming settings explained"
}
[/block]
## Video encoding presets
Frame supports multiple custom presets optimized for specific use cases. Both variable bit rate (VBR) and constant bit rate (CBR) presets are available. Variable bit rate presets are better for displaying a mix of static and dynamic content (e.g., a static display of a 3D model combined with rotating/translating parts), while constant bit rate presets are better for displaying dynamic content (movie-like experiences, video conferencing, video games, etc.). There is also a low bit rate preset which minimizes the bandwidth used, useful for network connections with very limited available bandwidth. The sketch after the preset list summarizes how these presets map to typical workloads.

**Available Video Presets**

* Auto: Default video preset
* vbrlow: Variable Bit Rate - Low Quality
* vbrhigh: Variable Bit Rate - High Quality
* cbrlow: Constant Bit Rate - Low Quality
* cbrhigh: Constant Bit Rate - High Quality
* lowbitrate: Low Bit Rate - minimizes bandwidth used
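To make the trade-offs easier to scan, here is a minimal, hypothetical Python sketch mapping common workload types to the preset names listed above. The `suggest_preset` helper and its workload labels are illustrative only; in practice, presets are selected from the QoS Settings control panel.

```python
# Illustrative mapping of workloads to the Frame video presets listed above.
# The helper and workload labels are hypothetical; presets are actually
# chosen from the QoS Settings control panel, not via an API.

PRESET_HINTS = {
    "mixed_static_dynamic": "vbrhigh",    # e.g., CAD model viewing with occasional rotation
    "mixed_low_bandwidth":  "vbrlow",     # mixed content on a constrained link
    "continuous_motion":    "cbrhigh",    # video conferencing, games, movie-like content
    "motion_low_bandwidth": "cbrlow",     # dynamic content on a constrained link
    "minimal_bandwidth":    "lowbitrate", # prioritize lowest possible bandwidth use
}

def suggest_preset(workload: str) -> str:
    """Return a suggested preset name, falling back to the default 'Auto'."""
    return PRESET_HINTS.get(workload, "Auto")

if __name__ == "__main__":
    print(suggest_preset("continuous_motion"))  # -> cbrhigh
    print(suggest_preset("unknown_workload"))   # -> Auto
```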
## Max frame rate
Frame automatically adjusts the video frame rate in response to application activity and available bandwidth. Under normal circumstances, the maximum frame rate is 30 frames per second (fps). Limiting the maximum frame rate can reduce bandwidth requirements, but it may cause choppiness and can make interactive editing tasks difficult.

Supported range: 5 - 30 fps for GPU-enabled instances; 5 - 15 fps for Frame Air instances

## Max video bit rate
Frame limits the maximum video stream bit rate to 16,000 kbps. Lowering the bit rate limits the overall bandwidth available to Frame, reducing both frame rate and image quality.

Supported range: 256 kbps - 16,000 kbps

**Limitations**
Setting the maximum video bit rate to a value lower than 6,000 kbps will also affect the maximum frame rate and maximum audio bit rate. For example, if the maximum video bit rate is set to 2,000 kbps, the frame rate will be automatically limited to 14 fps and the audio bit rate will be limited to 128 kbps.

## Max audio bit rate
By limiting the maximum audio bit rate, it is possible to reduce the bandwidth available for audio, independently of any settings offered by the audio source, and so reduce overall bandwidth requirements.

Supported range: 0 - 160 kbps. 0 disables the audio channel.

## Scale video
Changing the video scale resizes the virtual desktop/app session to reduce overall bandwidth requirements. Scale Video can reduce the session size by as much as 50%. Setting Video Scale to 0.5 (50%) scales down a full-screen session running at 1024×768 to 512×384. At this resolution, the resized display occupies one quarter of the space of the original and requires approximately one quarter of the original bandwidth to transmit. When the Frame terminal receives the image, it is rendered at its original size but at a lower resolution. You get a similar effect if you watch a YouTube video with the quality set to 360p. While the resulting image is blurred, it may still be acceptable, depending on the content being viewed. Work requiring high image fidelity, such as editing documents and spreadsheets, is unlikely to benefit from this approach, but it may be suitable when participating in a video conference where the highest video quality is not required. A short sketch of this scaling arithmetic follows the image below.

Supported range: 0.5 - 1 (50% - 100% of original session size)
[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/2c837fd-Scale_Video.PNG",
        "Scale Video.PNG",
        651,
        293,
        "#cc89a7"
      ]
    }
  ]
}
[/block]
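The following is a minimal sketch of the Scale Video arithmetic described above, assuming bandwidth scales roughly with the number of pixels transmitted. The resolutions come from the 1024×768 example; the function names are illustrative only.

```python
# Illustrative sketch of the Scale Video arithmetic described above,
# assuming bandwidth scales roughly with the number of pixels transmitted.

def scaled_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Resolution actually transmitted for a given Video Scale (0.5 - 1.0)."""
    return int(width * scale), int(height * scale)

def bandwidth_fraction(scale: float) -> float:
    """Approximate fraction of the original bandwidth required."""
    return scale * scale  # pixel count shrinks with the square of the scale

if __name__ == "__main__":
    w, h = scaled_resolution(1024, 768, 0.5)
    print(w, h)                     # 512 384
    print(bandwidth_fraction(0.5))  # 0.25 -> roughly one quarter of the bandwidth
```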
## Max video quantization
The video quantization setting controls the way the Frame protocol encodes the video stream. Quantization can be thought of as a measure of the degree to which a video stream is compressed for a given image quality. Quantization is determined dynamically by the H.264 encoder based on both the available bandwidth and the display content. How this is done is beyond the scope of this article, but the output is readily understandable: complex content, such as a high-resolution CAD drawing, is assigned a low quantization factor and is only lightly compressed to maintain quality, while simple, ‘low information’ content receives a higher quantization value and is compressed to a greater degree.

The Max video quantization control sets the amount of compression that the encoder must use, across a range from 48 (heavy compression, lower image quality) to 24 (light compression, better image quality), with a default setting of 42. Increasing the quantization factor from 42 towards 48 forces the encoder to compress all content more aggressively, reducing the amount of bandwidth required but risking more visible compression artifacts in complex images. Decreasing the quantization factor towards 24 permits the encoder to use less compression and so achieve higher image quality, but only if there is bandwidth available to transmit the video stream. Setting a low quantization factor will not improve image quality unless there is sufficient bandwidth available for the encoder to take advantage of it.

Note: Setting quantization to a low value (24-28) may make network latency effects more pronounced.

## Best video quality
Frame’s H.264 implementation uses YUV 4:2:0 chroma subsampling to encode images. This takes advantage of the fact that the human eye cannot distinguish color differences to the same degree that it can distinguish variations in brightness. By sending less information about color than about brightness, it is possible to reduce the amount of bandwidth required substantially without significantly compromising image quality. This does an excellent job of reducing the bandwidth required, but in some situations, especially in apps where regions of strongly contrasting colors are displayed next to each other, chroma subsampling can result in colors "bleeding" into each other with undesirable results.

To support our customers who need absolute color fidelity, Frame also provides support for YUV 4:4:4 encoding. This turns off chroma subsampling, sending the full depth of color information for every pixel. Because more color information is sent, there is a corresponding increase in required bandwidth. YUV 4:4:4 encoding is enabled by selecting "Best video quality". The Frame Terminal session sizing icon (the magnifying glass icon) changes color to indicate when "Best video quality" is in use, as follows:

Green: The session is in Best Video Quality mode
Orange: Best Video Quality (YUV 4:4:4) mode has been selected, but the session is unable to provide it due to bandwidth limitations
White: Best Video Quality mode has not been selected (YUV 4:2:0)

[block:image]
{
  "images": [
    {
      "image": [
        "https://files.readme.io/7f701a2-Image_Quality_Indicator.png",
        "Image Quality Indicator.png",
        700,
        148,
        "#1f3046"
      ]
    }
  ]
}
[/block]
Note: Best video quality is available only when using the Chrome browser.

## Grayscale
With Grayscale enabled, color information is not transmitted (this is YUV 4:0:0 encoding), enabling a substantial reduction in bandwidth.

Because Grayscale sends no color information and "Best video quality" sends full color information, these two settings are mutually exclusive. The sketch below compares the raw per-pixel data for the three sampling modes.
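As a rough intuition for why these modes differ in bandwidth, the sketch below compares uncompressed bytes per pixel for YUV 4:4:4, 4:2:0, and 4:0:0, assuming 8-bit samples. Actual Frame bandwidth depends on H.264 compression and the other QoS settings, so the figures are only a relative illustration.

```python
# Rough comparison of raw (pre-compression) data per pixel for the three
# YUV sampling modes mentioned above, assuming 8-bit samples (1 byte each).
# Real bandwidth depends on H.264 compression; this is only a relative guide.

BYTES_PER_PIXEL = {
    "4:4:4": 3.0,  # full luma + full chroma for every pixel ("Best video quality")
    "4:2:0": 1.5,  # full luma; chroma shared across each 2x2 block of pixels (default)
    "4:0:0": 1.0,  # luma only, no chroma (Grayscale)
}

def raw_frame_bytes(width: int, height: int, sampling: str) -> float:
    """Uncompressed size of one frame, in bytes, for the given sampling mode."""
    return width * height * BYTES_PER_PIXEL[sampling]

if __name__ == "__main__":
    for mode in ("4:4:4", "4:2:0", "4:0:0"):
        mb = raw_frame_bytes(1024, 768, mode) / 1e6
        print(f"{mode}: {mb:.2f} MB per raw frame")
```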
## Recommendations
### Unconstrained Networks
In environments where network bandwidth is effectively unconstrained, Frame QoS settings can be adjusted to enhance the user experience in situations where improved image quality is advantageous.

**Picture editing/proofing**
Enable "Best Video Quality" to ensure the highest color fidelity.
Decrease "Max Video Quantization" to minimize image artifacts.

**Spreadsheets, CAD packages, and similar applications**
Enable "Best Video Quality" to prevent 'color bleed'.
Decrease "Max Video Quantization" to minimize image artifacts.

### Bandwidth Constrained Networks

**Gaming**
Prioritize frame rate for the best overall experience.
Consider reducing "Video Scale" or increasing "Max Video Quantization" to reduce per-session bandwidth requirements and allow an increase in the number of sessions.

**Videoconferencing and webinars**
Bandwidth requirements for VoIP and audio/video conferencing services vary greatly. Codec choice, audio quality, display resolution and aspect ratio, image complexity, and activity levels all play a significant role in determining bandwidth requirements. Individual settings can usually be adjusted to reduce overall bandwidth requirements. However, not all systems offer sufficient control to deliver acceptable performance on low-bandwidth connections. Where necessary, further reductions in bandwidth requirements can be made by adjusting the Frame QoS settings.

Consider reducing "Video Scale" as a simple means of reducing the overall bandwidth required without the need to adjust more advanced QoS options.

Additional adjustments to reduce video bandwidth requirements include:
* Increasing "Max Video Quantization", which reduces bandwidth requirements at the cost of increased 'blockiness' in the video stream.
* Reducing "Max Video Bit Rate", which reduces bandwidth requirements at the expense of overall video quality and frame rate. By reducing "Max Video Bit Rate", it is possible to limit the overall bandwidth consumed by a Frame session and so ensure additional capacity is reserved for other purposes. A rough per-session budgeting example follows this list.
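As a rough illustration of the capacity planning behind these recommendations, the sketch below divides a shared link among concurrent sessions to arrive at a per-session cap that could then be applied via "Max video bit rate". The link size, reserved capacity, and session count are hypothetical values, not Frame defaults; only the 256 - 16,000 kbps range comes from the setting described above.

```python
# Hypothetical capacity-planning sketch: derive a per-session video bit rate
# cap from a shared link so that setting "Max video bit rate" leaves headroom
# for other traffic. The example numbers are illustrative, not Frame defaults.

def per_session_cap_kbps(link_kbps: int, reserved_kbps: int, sessions: int,
                         floor_kbps: int = 256, ceiling_kbps: int = 16_000) -> int:
    """Split the unreserved bandwidth evenly and clamp to the supported range."""
    if sessions < 1:
        raise ValueError("sessions must be >= 1")
    share = (link_kbps - reserved_kbps) // sessions
    return max(floor_kbps, min(ceiling_kbps, share))

if __name__ == "__main__":
    # 50 Mbps link, 10 Mbps reserved for other traffic, 10 concurrent sessions
    print(per_session_cap_kbps(50_000, 10_000, 10))  # -> 4000 kbps per session
```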