HTTP Persistence & Web Caching
  1. COMP 431: Internet Services & Protocols
     HTTP Persistence & Web Caching
     Jasleen Kaur, January 30, 2020

  2. HTTP Protocol Design: Non-persistent connections
     ◆ The default browser/server behavior in HTTP/1.0 is for the connection to be closed after the completion of the request
       » Server parses the request, responds, and closes the TCP connection
       » The Connection: keep-alive header allows for persistent connections
     ◆ With non-persistent connections at least 2 RTTs are required to fetch every object
       » 1 RTT for the TCP handshake
       » 1 RTT for the request/response
     ◆ RTT = "Round Trip Time": the time to send a message and receive a response

  3. Non-Persistent Connections: Performance
     ◆ Components of delay: transmission, propagation, queueing, and nodal processing
     ◆ With non-persistent connections at least 2 RTTs are required to fetch every object
       » 1 RTT for the TCP handshake
       » 1 RTT for the request/response

  4. Non-Persistent Connections: Performance
     ◆ Example: a 1 Kbyte base page with five 1 Kbyte embedded images coming from the West coast on an OC-48 link
       » 1 RTT for the TCP handshake = 0.001 ms + 50 ms
       » 1 RTT for the request/response = 0.006 ms + 50 ms
     ◆ Page download time with non-persistent connections?
     ◆ Page download time with a persistent connection?
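The download-time questions in the example above (a 1 Kbyte base page plus five 1 Kbyte images over an OC-48 link) can be worked out numerically. A minimal Python sketch, using the slide's figures (50 ms of round-trip propagation, 0.001 ms and 0.006 ms transmission times); the variable names are mine:

```python
# Assumptions taken from the slide: RTT propagation = 50 ms, a small control
# packet takes 0.001 ms to transmit, a 1-Kbyte object takes 0.006 ms on OC-48.
RTT = 50.0              # ms, round-trip propagation delay
TX_SMALL = 0.001        # ms, transmission time of a small control packet
TX_OBJECT = 0.006       # ms, transmission time of one 1-Kbyte object
N_OBJECTS = 6           # base page + five embedded images

# Non-persistent: every object pays a handshake RTT plus a request/response RTT.
non_persistent = N_OBJECTS * ((TX_SMALL + RTT) + (TX_OBJECT + RTT))

# Persistent (no pipelining): one handshake, then sequential request/responses.
persistent = (TX_SMALL + RTT) + N_OBJECTS * (TX_OBJECT + RTT)

print(f"non-persistent: {non_persistent:.1f} ms")  # non-persistent: 600.0 ms
print(f"persistent:     {persistent:.1f} ms")      # persistent:     350.0 ms
```

The persistent connection saves one handshake RTT per object after the first, which is where the roughly 250 ms improvement comes from.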

  6. Non-Persistent Connections: Parallel connections
     ◆ To improve performance a browser can issue multiple requests in parallel to a server (or servers)
       » The server parses each request, responds, and closes the TCP connection
     ◆ Page download time with parallel connections?
       » 2 parallel connections =
       » 4 parallel connections =

  7. Persistent Connections: Persistent connections with pipelining
     ◆ Pipelining in a persistent connection allows a client to make the next request before the response to the previous request has been received
       » Connections are persistent and pipelined by default in HTTP/1.1
     ◆ What is the page download time in the previous example with a persistent, pipelined connection?
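The parallel and pipelined download times asked about above can be sketched with the same figures. A hedged sketch, assuming parallel non-persistent connections fetch objects in rounds of k, each round costing 2 RTTs, and ignoring queueing; the function names are mine:

```python
import math

# Same assumptions as the earlier example: RTT = 50 ms, 0.006 ms per 1-Kbyte
# object, 6 objects in total (base page + five images).
RTT = 50.0
TX_OBJECT = 0.006
N = 6

# Pipelined persistent connection: one handshake RTT, then all requests go out
# back-to-back, so only one more RTT plus the objects' transmission times.
pipelined = RTT + RTT + N * TX_OBJECT

def parallel(k):
    # k parallel non-persistent connections: objects fetched in rounds of k,
    # each round paying a handshake RTT plus a request/response RTT.
    return math.ceil(N / k) * (2 * RTT + TX_OBJECT)

print(f"pipelined:  {pipelined:.1f} ms")    # pipelined:  100.0 ms
print(f"2 parallel: {parallel(2):.1f} ms")  # 2 parallel: 300.0 ms
print(f"4 parallel: {parallel(4):.1f} ms")  # 4 parallel: 200.0 ms
```

Under these assumptions a single pipelined persistent connection beats even four parallel non-persistent connections, because it pays the handshake cost only once.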

  8. HTTP Protocol Design: Persistent v. non-persistent connections
     ◆ Non-persistent
       » HTTP/1.0
       » Server parses the request, responds, and closes the TCP connection
       » At least 2 RTTs to fetch every object
     ◆ Persistent
       » Default for HTTP/1.1 (negotiable in 1.0)
       » Client sends requests for multiple objects on one TCP connection
       » Server parses a request, responds, parses the next request, responds...
       » Fewer RTTs
     ◆ Parallel vs. persistent connections?
     ◆ What is my browser doing?
       » Chrome -> Inspect -> Network -> Waterfall
       » Wireshark

  9. HTTP User-Server Interaction: Browser caches
     ◆ Browsers cache content from servers (in a disk cache) to avoid future server interactions to retrieve the same content
       » On a cache hit the object is served locally; on a miss the browser fetches it from the origin server over the Internet
     ◆ Caching-related issues?

  10. HTTP User-Server Interaction: The conditional GET
     ◆ If the object in the browser cache is "fresh," the server won't re-send it
       » Browsers save the current date along with the object in the cache
     ◆ The client specifies the date of the cached copy in the HTTP request:
       If-modified-since: <date>
     ◆ The server's response contains the object only if it has been changed since the cached date
     ◆ Otherwise the server returns:
       HTTP/1.0 304 Not Modified

  12. HTTP User-Server Interaction: The conditional GET
     ◆ Object not modified:
       Client -> Server: HTTP request with If-modified-since: <date>
       Server -> Client: HTTP response: HTTP/1.0 304 Not Modified
     ◆ Object modified:
       Client -> Server: HTTP request with If-modified-since: <date>
       Server -> Client: HTTP response: HTTP/1.0 200 OK ... <data>
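The two conditional-GET outcomes can be sketched as plain request/response handling. No real network I/O is performed here; the helper names, host, and path are illustrative, not part of the slides:

```python
# Build the conditional request the client sends, and branch on the server's
# status line exactly as the slide describes: 304 -> reuse cache, 200 -> replace.
def conditional_get(path, host, cached_date):
    # The client includes the date of its cached copy.
    return (f"GET {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"
            f"If-modified-since: {cached_date}\r\n"
            f"\r\n")

def handle_response(status_line, cached_object, body):
    # 304: the cached copy is still fresh; 200: use the freshly sent body.
    if status_line.startswith("HTTP/1.0 304"):
        return cached_object
    if status_line.startswith("HTTP/1.0 200"):
        return body
    raise ValueError(f"unexpected status: {status_line}")

req = conditional_get("/index.html", "www.example.com",
                      "Thu, 30 Jan 2020 00:00:00 GMT")
fresh = handle_response("HTTP/1.0 304 Not Modified", "<cached copy>", None)
print(fresh)  # <cached copy>
```

Either way the client ends up with a current copy of the object; the 304 path simply avoids re-transmitting the body.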

  13. Cache Performance for HTTP Requests
     ◆ What is the average time to retrieve a web object?
       » T_mean = hit_ratio x T_cache + (1 - hit_ratio) x T_server
         where hit_ratio is the fraction of objects found in the cache
       » Mean access time from a disk cache =
       » Mean access time from the origin server =
     ◆ For a 60% hit ratio, the mean client access time is:
       » (0.6 x 10 ms) + (0.4 x 1,000 ms) = 406 ms

  14. Cache Performance for HTTP Requests: What determines the hit ratio?
     ◆ Cache size
     ◆ Locality of references
       » How often the same web object is requested
     ◆ How long objects remain "fresh" (unchanged)
     ◆ Object references that can't be cached at all
       » Dynamically generated content
       » Protected content
       » Content purchased for each use
       » Advertisements ("pay-per-click" issues)
       » Content that must always be up-to-date
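The mean-access-time formula above is a simple weighted average, and the slide's 60% example can be checked directly; the function name is mine:

```python
# T_mean = hit_ratio * T_cache + (1 - hit_ratio) * T_server
def mean_access_time(hit_ratio, t_cache_ms, t_server_ms):
    # hit_ratio: fraction of requests served from the cache.
    return hit_ratio * t_cache_ms + (1 - hit_ratio) * t_server_ms

# Slide's numbers: 10 ms disk-cache access, 1,000 ms origin-server access.
t = mean_access_time(0.6, 10, 1000)
print(f"{t:.0f} ms")  # 406 ms
```

Note how the slow origin-server term dominates: even with a 60% hit ratio, misses contribute 400 of the 406 ms.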

  16. The Impact of Web Traffic on the Internet
     (chart: MCI backbone traffic in bytes by protocol, 1998)

  17. Traffic Makeup on UNC Link
     (chart: inbound traffic, 2016)
     ◆ Note the dominance of HTTPS over HTTP
     ◆ Also note that "streaming" excludes streaming done over HTTP

  18. Caching on the Web: Web caches (proxy servers)
     ◆ Web caches are used to satisfy client requests without contacting the origin server
     ◆ Users configure browsers to send all requests through a shared proxy server
       » The proxy server is a large cache of web objects
     ◆ Browsers send all HTTP requests to the proxy
       » If the object is in the cache, the proxy returns it in an HTTP response
       » Else the proxy requests the object from the origin server, then returns it in an HTTP response to the browser
     ◆ Open research question: how does the proxy hit ratio change with the population of users sharing it?

  19. Why do Proxy Caching? The performance implications of caching
     ◆ Consider a cache that is "close" to the client
       » e.g., on the same LAN
     ◆ Nearby caches help with:
       » Smaller response times
       » Decreased traffic on the egress link to the institutional ISP (often the primary bottleneck)
     (diagram: origin servers - public Internet - 1.5 Mbps access link - campus network - 10 Mbps LAN - clients and proxy server)
     ◆ To improve Web response times, should one buy a 10 Mbps access link, or a proxy server?

  20. Why do Proxy Caching? The performance implications of caching
     ◆ Web performance without caching:
       » Mean object size = 50 Kbits
       » Mean request rate = 29/sec
       » Mean origin server access time = 1 sec
       » Average response time = ??
     (diagram: origin servers - public Internet - 1.5 Mbps access link - campus network - 10 Mbps LAN)

  23. Why do Proxy Caching? The performance implications of caching
     ◆ Same setup; traffic intensity on the access link:
       » (50 Kbits/req x 29 reqs/sec) / 1.5 Mbps = 0.97
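The traffic-intensity figure above is the offered load divided by the link capacity, which is dimensionless. A quick check of the slide's arithmetic:

```python
# rho = (mean object size * request rate) / access-link capacity
object_size_kbits = 50
request_rate = 29            # requests per second
link_capacity_bps = 1.5e6    # 1.5 Mbps access link

rho = (object_size_kbits * 1000 * request_rate) / link_capacity_bps
print(f"intensity = {rho:.2f}")  # intensity = 0.97
```

An intensity this close to 1 means the access link is nearly saturated, so queueing delay on it explodes; that is the motivation for the upgrade-vs-proxy comparison that follows.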

  24. Why do Proxy Caching? The performance implications of caching
     ◆ Upgrade the access link to 10 Mb/s
       » Response time = ??
       » Queueing is negligible, hence response time = 1 sec (+ 10 ms)
     ◆ Add a proxy cache with a 40% hit ratio and 10 ms access time
       » Response time = ??
       » Traffic intensity on the access link = 0.6 x 0.97 = 0.58
       » Response time = 0.4 x 10 ms + 0.6 x 1,089 ms ≈ 657 ms
     ◆ A proxy cache lowers response time, lowers access link utilization, and saves money!

  25. Why do Proxy Caching? The case for proxy caching
     ◆ Lower latency for users' web requests
     ◆ Reduced traffic at all network levels
     ◆ Reduced load on servers
     ◆ Some level of fault tolerance (network, servers)
     ◆ Reduced costs to ISPs, content providers, etc., as web usage continues to grow exponentially
     ◆ More rapid distribution of content
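The proxy-cache numbers above can be checked directly. The sketch takes the slide's miss-time figure of 1,089 ms (origin access plus access-link queueing at the reduced intensity) as given rather than deriving it from a queueing model:

```python
# Adding a proxy cache (40% hit ratio): only misses cross the access link,
# so link intensity drops to 0.6 of its old value, and the mean response time
# is a weighted average of hits (10 ms) and misses (1,089 ms per the slide).
hit_ratio = 0.4
rho_no_cache = 0.97
t_hit_ms = 10
t_miss_ms = 1089

rho_with_cache = (1 - hit_ratio) * rho_no_cache
t_mean = hit_ratio * t_hit_ms + (1 - hit_ratio) * t_miss_ms
print(f"intensity = {rho_with_cache:.2f}")   # intensity = 0.58
print(f"mean response = {t_mean:.0f} ms")    # mean response = 657 ms
```

The proxy gets within a factor of ~1.5 of a full 1 sec origin fetch avoided, without paying for a faster access link, which is the slide's closing argument.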

  26. HTTP User-Server Interaction: Authentication
     ◆ Problem: how to limit access to server documents?
       » Servers provide a means to require users to authenticate themselves
     ◆ HTTP includes a header for the user to specify a name and password (on a GET request)
       » If no authorization is presented, the server refuses access and sends a WWW-Authenticate: header line in its response (401: authorization required)
     ◆ Stateless: the client must send the authorization with each request
       » A stateless design
       » (But the browser may cache credentials)

  27. HTTP User-Server Interaction: Cookies
     ◆ The server sends a "cookie" to the browser in a response message:
       Set-cookie: <value>
     ◆ The browser presents the cookie in later requests to the same server:
       cookie: <value>
     ◆ The server matches the cookie with server-stored information, enabling cookie-specific actions
       » Provides authentication
       » Client-side state maintenance (remembering user preferences, previous choices, ...)
     ◆ Example exchange: the server responds with Set-cookie: S1; later requests carry cookie: S1; a later response may carry Set-cookie: S2
     ◆ Chrome: Inspect -> Application -> Cookies
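The cookie exchange above amounts to a small piece of client-side state. A sketch with simplified header handling; the helper names and host are illustrative, not part of the slides:

```python
# The browser stores any Set-cookie value per server, then presents it on
# later requests to that same server.
cookie_jar = {}  # client-side state: server name -> cookie value

def on_response(server, headers):
    # Remember the cookie the server set, keyed by server.
    if "Set-cookie" in headers:
        cookie_jar[server] = headers["Set-cookie"]

def build_request(server, path):
    # Later requests to the same server carry the stored cookie.
    lines = [f"GET {path} HTTP/1.0", f"Host: {server}"]
    if server in cookie_jar:
        lines.append(f"Cookie: {cookie_jar[server]}")
    return "\r\n".join(lines) + "\r\n\r\n"

on_response("www.example.com", {"Set-cookie": "S1"})
print(build_request("www.example.com", "/cart.html"))  # includes "Cookie: S1"
```

Note that all the state lives on the client: the server stays stateless between requests, matching each presented cookie against its stored records.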
