About the Author

Harald Zeitlhofer has 15+ years of experience as an architect and developer of enterprise ERP solutions and web applications, with a main focus on efficient and performant business processes, usability and application design. As a Technology Strategist in Dynatrace's Centre of Excellence team, he influences the Dynatrace product strategy by working closely with customers and driving their performance management and improvement at the front line. He is a frequent speaker at conferences and meetup groups around the world. Follow him @HZeitlhofer.

How to Spruce up your Evolved PHP Application – Part 2

In the first part of my blog I covered the data side of the tuning process for my homegrown PHP application Spelix: database issues and caching on both the server and the client.

By just applying these insights I could bring Spelix to a stage where the number of users could be increased by more than 150%, and user experience could be improved to make existing users eager to work with the system. By having more users contributing and delivering input, Spelix became more and more a platform for the caving community to meet and share their work. However, I did not stop my mission there. There was more to be done.

Spelix offers a lot more functionality now thanks to performance improvements.

In this part, I will concentrate more on technical topics: network traffic, code caching and session handling.

Step #4: Reduce Network Traffic

Functionally extending your application does not only require additional PHP code on your server, it also demands new logic on the client. More and more of these requirements can be served by using JavaScript libraries, but with every library you add, the client has to load additional scripts and stylesheets from the server.

The Browser Timeline view shows what’s happening when a page is loaded. This example is a visualization of loading the start page.

Downloading the start page takes a total of 79 roundtrips for JavaScript, CSS and image files; 64 of them go to a single domain. This is quite a lot.

The Browser Performance Report is a helpful tool to easily identify unnecessary roundtrips on your network.

When we check the timeline in detail we even get to see the actual impact of having that many requests. Not all simultaneous web requests can be processed in parallel, because browsers have a maximum number of connections per domain for downloading resources. The more resources on a single domain, the more requests get queued up and have to wait. Typical numbers for parallel connections are 6-10, depending on your browser.

Tip: Take these numbers into account when you split or merge your resource files. You will benefit from parallel data transfer without the overhead of too many requests.

The network dashboard shows requests per physical browser network connection, and all these are sent to spelix.at. Note that your browser limits the number of simultaneous http connections to a single domain!

This timeline visualization really helps to understand how the browser downloads resources. The more requests per domain the longer it takes due to connection limitations

This timeline visualization really helps to understand how the browser downloads resources. The more requests per domain the longer it takes due to connection limitations

By merging these files into fewer container files you can avoid a lot of network roundtrips, as every single file creates a request to the webserver. Be sure to define the content of your container files properly; don't pack everything into one file!
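
A minimal sketch of what such a merge step might look like, run as part of your build or deployment; the bundle names, source files and the split into a core bundle and a maps-only bundle are made up for illustration:

    <?php
    // Hypothetical build-time script: merge individual JavaScript files into
    // a few container files so the browser needs only one request per bundle.
    $bundles = array(
        'js/core.bundle.js' => array('js/jquery.js', 'js/app.js', 'js/forms.js'),
        'js/maps.bundle.js' => array('js/leaflet.js', 'js/map.js'),  // only loaded on pages with maps
    );

    foreach ($bundles as $target => $sources) {
        $merged = '';
        foreach ($sources as $file) {
            // ';' guards against source files missing a trailing semicolon
            $merged .= file_get_contents($file) . ";\n";
        }
        file_put_contents($target, $merged);
    }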

Tips & Tricks:

  • As an example, JavaScript for map generation is not required on pages that don’t contain maps, and therefore the respective code does not need to be loaded!
  • Pack your icons into sprites! One container image file for your small icon files creates only one server request compared to one request per icon! In your CSS simply define the section of the sprite that contains your icon.
  • Get some more tips and tricks by checking out Best Practices on Network Requests and Roundtrips!
  • Avoid physical network requests by setting up proper caching, as described in the next step!

Step #5: Leverage Browser / CDN cache

On top of the unnecessary network traffic, the files from our example are not even cached, as shown in the browser network section:

Too often do we forget to set proper cache headers. Getting an overview per web request makes it easy to spot resources that have no or inadequate response headers set.

Data can be cached in your browser or even on the way there, in a CDN (content delivery network) or proxy server between the server and the browser. Be sure to set reasonable HTTP headers to cache your data properly. In PHP you could do this by using the header() function:

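A minimal sketch of such header() calls, assuming a cache lifetime of one day (86,400 seconds):

    <?php
    // Mark the response as cacheable by browsers and proxies for one day.
    header('Cache-Control: public, max-age=86400');
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 86400) . ' GMT');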

These examples store your HTTP response in the cache and keep it valid for 1 day.

Alternatively you could use Apache's mod_expires module to set the Expires header without having to change your code. Add the following lines to your httpd.conf to enable and configure the module:
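
A sketch of such a configuration; the module path and the one-day lifetime are examples, so adjust the content types and lifetimes to your application:

    # httpd.conf: enable and configure mod_expires
    LoadModule expires_module modules/mod_expires.so

    ExpiresActive On
    ExpiresByType text/css               "access plus 1 day"
    ExpiresByType application/javascript "access plus 1 day"
    ExpiresByType image/png              "access plus 1 day"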

Another option is to set your response header in the .htaccess file:

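For example, an .htaccess entry along these lines (assuming mod_expires is enabled and overrides are allowed for the directory):

    # .htaccess: cache static resources for one day
    <FilesMatch "\.(css|js|png|gif|jpe?g|ico)$">
        ExpiresActive On
        ExpiresDefault "access plus 1 day"
    </FilesMatch>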

Conditional and Non-Conditional Caching

If you are using the ETag field in your response headers, the ETag value is stored in your browser cache together with the document. On a subsequent request the browser sends that value back to the server so it can detect changes to the resource. If the document is unchanged, the server responds with a 304 Not Modified header and an empty body, and the browser then uses the cached document. This is known as conditional caching.
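
A sketch of what this negotiation can look like in PHP; the resource name is hypothetical, and how you derive the ETag value is up to your application:

    <?php
    // Conditional caching: answer with 304 Not Modified if the client's
    // cached copy (identified by its ETag) is still current.
    $file = 'export/cave_map.json';                    // hypothetical resource
    $etag = '"' . md5_file($file) . '"';

    header('ETag: ' . $etag);
    header('Cache-Control: public, max-age=0, must-revalidate');

    if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
        trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
        header('HTTP/1.1 304 Not Modified');           // unchanged: empty body
        exit;
    }

    readfile($file);                                   // changed or not cached yet: full body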

Exploring HTTP Headers for each resource allows you to understand how browser and server negotiate resource cache settings.

As long as the resource in the cache has not expired, the browser won't send a request to the server at all (unless you press reload) but fetches the document straight from the cache. This is what we know as non-conditional caching.

If the browser has a cached version of a resource, it simply takes it from the local cache instead of doing a roundtrip to the server.

HTML5 Application Cache

Another caching technique has been introduced with HTML5: the Application Cache. In a cache manifest file you define the documents to be stored in the browser. These documents can then be used for complete offline browsing. In your main document you just reference the manifest file:
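
A sketch; the manifest file name spelix.appcache and its entries are made up for this example:

    <!-- in the main HTML document -->
    <html manifest="spelix.appcache">

    # spelix.appcache
    CACHE MANIFEST
    # version 2015-03-01
    css/app.css
    js/core.bundle.js
    img/icons.png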

By just applying reasonable caching rules we get much better results:

The timeline shows that the total request can be processed much faster now.

In the performance report we can see the details of the performance optimization. The business data received via AJAX calls is not affected by this caching logic, but by the local data caching using LocalStorage described in Step #3 of How to Spruce up your Evolved PHP Application – Part 1.

If you want more tips and tricks, check out Best Practices for Browser Caching.

Step #6: Optimize Session Handling

In a good application design, your browser content is divided into structure, layout, business logic and data. Data is commonly fetched from the server using AJAX calls, and most likely these requests are performed in parallel, interacting with the same HTTP session.

3 AJAX calls on the start page show almost the same long execution time. This looks suspicious…

In Spelix we have such logic in the start page, which shows a map and loads further data via AJAX. But we found that these requests are rather slow, and all of them take almost the same time.

Starting your session in PHP with session_start() reads your session data from the file system into the $_SESSION variable and locks the file against further access. A parallel process in the same session can't access the file until the locking process has released the lock. This can slow down your entire application when one slow process delays all the others, and this is exactly what happened in Spelix. An AJAX call that just requests data generally does not need to modify $_SESSION at all. Make sure to release the lock right after dealing with your session data by:

  • locking and releasing session files explicitly, e.g. by calling session_write_close() as soon as the session data has been read (important when handling parallel requests in a session via multiple AJAX calls)
  • using a data cache such as memcached for session data instead of file-based sessions

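A minimal sketch of an AJAX endpoint that reads its session data and then releases the lock immediately; the variable names are just examples, and the memcached settings in the comment assume the memcached extension is installed:

    <?php
    // Release the session lock as early as possible so parallel AJAX
    // requests of the same session are not blocked.
    session_start();                                        // acquires the lock
    $user = isset($_SESSION['user']) ? $_SESSION['user'] : null;
    session_write_close();                                  // writes and releases the lock

    // Long-running work (database queries, map data, ...) happens here
    // without holding the session lock.

    // Alternative: avoid session file locking altogether by storing
    // sessions in memcached (php.ini):
    //   session.save_handler = memcached
    //   session.save_path    = "localhost:11211"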

Now It’s Your Turn

It's easy to trace your application and identify the hotspots that might cause bad performance. Download our free trial of dynaTrace or the free dynaTrace AJAX Edition to start monitoring your performance, and share your experiences with us in the forum on our community!

 

 
