server push
Recently Published Documents


TOTAL DOCUMENTS: 32 (FIVE YEARS: 4)
H-INDEX: 6 (FIVE YEARS: 0)

Author(s):  
Yuto Ito ◽  
Yoshifumi Manabe
Keyword(s):  


Author(s):  
Abhishek Rastogi ◽  
Shashank Vats ◽  
Shivam Pundir ◽  
Ramender Singh ◽  
...  

Webpages have become increasingly complex in recent years, and their loading times have grown accordingly. This paper addresses the issue with personalized edge computing. In conventional edge computing, an edge server interacts with cloud web servers. In the personalized edge computing considered here, by contrast, an edge server referred to as an Edge Server in the Middle (ESM) collaborates with users' mobile phones. This research focuses on two strategies based on personalized edge computing: edge-assisted caching and edge-assisted reprioritization. Edge-assisted caching decreases page load time because the ESM stores cached components on behalf of mobile devices. Edge-assisted reprioritization forces the web browser to render visual components earlier and lowers the white screen time. In addition, the ESM uses HTTP/2 rather than HTTP/1.1, which decreases the number of interactions between a mobile device and the ESM and allows advanced features such as prioritization and server push to be used. Edge-assisted caching was implemented on a high-end PC serving as the ESM and on Google Chrome for Android as the mobile web browser. According to the experimental results, edge-assisted caching reduced the loading time of a popular website by 59 percent in an extremely congested network. Another experiment found that edge-assisted reprioritization cut the white screen time of a photo-heavy webpage by 21 percent. Index terms: edge computing, reprioritization, mobile device, web browsing, caching.
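The HTTP/2 features credited above for cutting round trips between the mobile device and the ESM, prioritization and server push, can be illustrated with a short sketch. Below is a minimal TypeScript (Node.js http2) example of a server pushing a cached sub-resource alongside the requested page, roughly the role an edge-assisted cache plays for a mobile browser. This is not the paper's ESM implementation; the certificate files, the pushed stylesheet, and its contents are hypothetical placeholders.

```typescript
// Minimal sketch (not the paper's ESM): an HTTP/2 server that pushes a cached
// stylesheet alongside the requested page, the way an edge-assisted cache could
// hand a mobile browser sub-resources it already holds.
import * as http2 from "http2";
import * as fs from "fs";

const server = http2.createSecureServer({
  key: fs.readFileSync("edge-key.pem"),   // placeholder certificate files
  cert: fs.readFileSync("edge-cert.pem"),
});

server.on("stream", (stream, headers) => {
  if (headers[":path"] !== "/") {
    stream.respond({ ":status": 404 });
    stream.end();
    return;
  }

  // Server push: offer /style.css before the browser discovers and requests it.
  stream.pushStream({ ":path": "/style.css" }, (err, pushStream) => {
    if (err) return;                       // client may have disabled push
    pushStream.respond({ ":status": 200, "content-type": "text/css" });
    pushStream.end("body { margin: 0 }");  // stand-in for cached content
  });

  stream.respond({ ":status": 200, "content-type": "text/html" });
  stream.end("<html><head><link rel='stylesheet' href='/style.css'></head><body>Hello</body></html>");
});

server.listen(8443);
```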





2021 ◽  
pp. 193-210
Author(s):  
Alejandro Duarte
Keyword(s):  


2020 ◽  
Vol 10 (7) ◽  
pp. 2485
Author(s):  
Chanh Minh Tran ◽  
Tho Nguyen Duc ◽  
Phan Xuan Tan ◽  
Eiji Kamioka

HTTP/2 video streaming has received a lot of attention in the development of multimedia technologies over the last few years. In HTTP/2, the server push mechanism allows the server to deliver multiple video segments to the client within a single request in order to deal with the request explosion problem. As a result, recent research efforts have focused on utilizing this feature to enhance the streaming experience while reducing request-related overhead. However, current works only optimize the performance of a single client, without considering possible influences on other clients in the same network. When multiple streaming clients compete for shared bandwidth in HTTP/1.1, they are likely to suffer from unfairness, defined as inequality in their bitrate selections. For HTTP/1.1, existing works have shown that network-assisted solutions are effective in solving the unfairness problem. However, the feasibility of utilizing such an approach for the HTTP/2 server push has not been investigated. Therefore, in this paper, a novel proxy-based framework is proposed to overcome the unfairness problem in adaptive streaming over HTTP/2 with server push. Experimental results confirm the superior performance of the proposed framework in ensuring fairness, helping clients avoid rebuffering events and lowering the bitrate degradation amplitude, while maintaining the server push mechanism.
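The push mechanism this abstract builds on, answering one segment request with several subsequent segments pushed proactively, can be sketched as follows. This is an illustrative TypeScript (Node.js http2) example of that k-push pattern only; the fairness logic of the proposed proxy framework is not described in the abstract and is not reproduced here. The URL scheme, the segment layout on disk, the value of K, and the readSegment helper are all hypothetical.

```typescript
// Illustrative sketch of the HTTP/2 "k-push" pattern: a request for one video
// segment is answered together with pushes of the next K segments at the same
// bitrate, so the client need not issue K extra requests.
import * as http2 from "http2";
import * as fs from "fs";

const K = 3; // number of additional segments pushed per request (assumed value)

function readSegment(bitrate: string, index: number): Buffer {
  // Hypothetical layout: ./media/<bitrate>/seg-<index>.m4s
  return fs.readFileSync(`media/${bitrate}/seg-${index}.m4s`);
}

const server = http2.createSecureServer({
  key: fs.readFileSync("proxy-key.pem"),   // placeholder certificates
  cert: fs.readFileSync("proxy-cert.pem"),
});

server.on("stream", (stream, headers) => {
  // Assumed request form: /video/<bitrate>/seg-<index>.m4s
  const match = /^\/video\/(\w+)\/seg-(\d+)\.m4s$/.exec(headers[":path"] as string);
  if (!match) {
    stream.respond({ ":status": 404 });
    stream.end();
    return;
  }
  const [, bitrate, indexStr] = match;
  const index = Number(indexStr);

  // Push the next K segments proactively.
  for (let i = 1; i <= K; i++) {
    stream.pushStream({ ":path": `/video/${bitrate}/seg-${index + i}.m4s` }, (err, pushStream) => {
      if (err) return;
      pushStream.respond({ ":status": 200, "content-type": "video/mp4" });
      pushStream.end(readSegment(bitrate, index + i));
    });
  }

  // Respond to the explicitly requested segment.
  stream.respond({ ":status": 200, "content-type": "video/mp4" });
  stream.end(readSegment(bitrate, index));
});

server.listen(8443);
```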





Author(s):  
Yu Wang ◽  
Chanh Minh Tran ◽  
Tho Nguyen Duc ◽  
Xiaochun Wu ◽  
Phan Xuan Tan ◽  
...  


Author(s):  
Zahra Al-Awadai ◽  
Anne Brüggemann-Klein ◽  
Christina Grubmüller ◽  
Philipp Ulrich

“XML Everywhere” isn't just a slogan: it actually works, up and down the XML application stack. Recent developments, such as the inclusion of custom elements in HTML5, allow the declarative approach of XML to come into the browser/server interaction. XForms, supported by SVG and CSS, can serve as the basis for a graphical user interface. A custom WebSocket element can support client-to-client and server-push communication of XML data. Applications of State Chart XML (SCXML) mean that the “XML Everywhere” approach can be extended all the way to models of operations in an application. Interactive games offer living proof of the stack.
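As a rough illustration of the custom WebSocket element mentioned above, the sketch below defines a browser custom element in TypeScript that receives server-pushed XML over a WebSocket and re-dispatches it to the page as a DOM event. It is not the authors' element; the tag name ws-xml, the endpoint attribute, and the xml-pushed event are hypothetical.

```typescript
// A minimal sketch of a custom HTML element that wraps a WebSocket and surfaces
// server-pushed XML to the page (not the authors' element; names are hypothetical).
class WsXmlElement extends HTMLElement {
  private socket?: WebSocket;

  connectedCallback(): void {
    const url = this.getAttribute("endpoint");      // e.g. "wss://example.org/push"
    if (!url) return;
    this.socket = new WebSocket(url);
    this.socket.onmessage = (event: MessageEvent<string>) => {
      // Parse the pushed payload as XML and hand it to the rest of the page
      // (an XForms model, an SVG view, ...) via a bubbling DOM event.
      const doc = new DOMParser().parseFromString(event.data, "application/xml");
      this.dispatchEvent(new CustomEvent("xml-pushed", { detail: doc, bubbles: true }));
    };
  }

  disconnectedCallback(): void {
    this.socket?.close();
  }

  /** Send an XML document to the server (or, relayed by it, to other clients). */
  send(doc: XMLDocument): void {
    this.socket?.send(new XMLSerializer().serializeToString(doc));
  }
}

customElements.define("ws-xml", WsXmlElement);
```

A page could then declare <ws-xml endpoint="wss://example.org/push"></ws-xml> and let, for example, an XForms model or an SVG view listen for the xml-pushed event.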



Author(s):  
Torsten Zimmermann ◽  
Benedikt Wolters ◽  
Oliver Hohlfeld ◽  
Klaus Wehrle
Keyword(s):  

