So your PHP application is running slow… no, scratch that – "slow" still implies your application seems to be doing something after 45 seconds of loading. No, your application is a crippled duckling, dragging itself slowly toward the shoreline so it can end it all. What do you do??? Here are a few quick steps to help:
Add Log Points
Create a function that writes a message to a file. Then go through the code and add calls to this function at strategic points (e.g. right after a particularly large query). In the message, dump the date and time, the __LINE__ constant (which gives the line number where it appears, so pass it in from the call site), and a brief description of what has happened since the last message. If YOU can reproduce the speed problem, it also helps to make the function write to the log file only when your IP is the one visiting the application, so your log file doesn’t fill up too quickly or with other people’s data.
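Here’s a rough sketch of such a function; the log path and the IP address are placeholders to replace with your own:

function LogPoint($Line, $Message)
{
    // Only write when the request comes from your own IP address.
    if ($_SERVER['REMOTE_ADDR'] !== '203.0.113.5')
    {
        return;
    }
    $Entry = date('Y-m-d H:i:s') . ' [line ' . $Line . '] ' . $Message . "\n";
    file_put_contents('/tmp/speed_log.txt', $Entry, FILE_APPEND);
}

// Pass __LINE__ from the call site; inside the function it would always
// report the function's own line number.
LogPoint(__LINE__, 'Finished the big inventory query');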
Once the log file has some data in it, you should be able to see the flow of the program and be able to determine chunks of code that are running slow. Continue to refine the locations of the function calls to drill down to the problem points.
Improve Your SQL Queries
In many cases, a slow application is due to slow queries. Often, slow queries can be DRAMATICALLY improved with some very minor and safe tweaks to the database table indexes. I can’t begin to count the number of queries I’ve seen that tried to join two large tables using fields that were not indexed. There are several things to do to improve performance, but simply indexing those fields can often make a HUGE difference in query speed. Some databases allow for more specific indexing options that can make additional improvements, but nearly every database has basic indexing.
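As a hedged example (the table and column names here are invented for illustration), indexing the field that a JOIN matches on is often a one-line fix:

$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');

// Index the un-indexed field used in the JOIN:
$mysqli->query('CREATE INDEX idx_orders_customer ON orders (customer_id)');

// This join can now use the index instead of scanning the whole orders table:
$result = $mysqli->query('SELECT c.name, o.total FROM customers c JOIN orders o ON o.customer_id = c.id');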
Speaking of joining tables, data types can also play a large part in performance. Joining tables on numeric field types like INT is usually much faster than joining on VARCHAR fields (although you should be VERY careful about a decision to change a VARCHAR to a numeric field). This is why it’s a good habit to add auto-incrementing, numeric ID fields to the tables you create. However, data types aren’t just important when joining. Minor improvements can be made by making sure that you’re using the right data types to store things. There’s no reason to use a BLOB or TEXT field to store a Unix timestamp, a first name, or a tiny on/off flag (would you use a crate to hold a tiny pebble?).
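For instance, in this made-up table, every field gets a small, appropriate type instead of a catch-all TEXT or BLOB:

$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');
$mysqli->query('CREATE TABLE users (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,  -- numeric ID for fast joins
    first_name VARCHAR(50) NOT NULL,                      -- not TEXT
    last_login INT UNSIGNED NOT NULL,                     -- Unix timestamp, not BLOB
    is_active TINYINT(1) NOT NULL DEFAULT 0               -- tiny on/off flag
)');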
If you have a query whose WHERE clause filters on more than one field of a single, big table, consider creating a multi-field index that contains each of the fields used in the WHERE clause.
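Again with invented names, here’s one multi-field index covering both fields in the WHERE clause:

$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');
$mysqli->query('CREATE INDEX idx_orders_status_created ON orders (status, created_at)');

// Both conditions can now be satisfied by the single index:
$result = $mysqli->query("SELECT id, total FROM orders WHERE status = 'shipped' AND created_at > '2010-01-01'");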
Some databases, like MySQL, have additional features that allow you to discover problematic queries. These features include things like automatically logging any queries that take longer than a certain number of seconds, or commands that will show details about the query that you’re running. For example, if you’re using MySQL, take a slow-running SELECT query and simply add the word EXPLAIN before the query. The result is a description of how MySQL runs the query, what indexes it uses (if any), and other useful information.
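For example, assuming the same invented orders table as above:

$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');
$explain = $mysqli->query('EXPLAIN SELECT * FROM orders WHERE customer_id = 42');
while ($row = $explain->fetch_assoc())
{
    print_r($row); // the "key" and "rows" columns are especially telling
}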
There are too many tricks to list here, but it’s not difficult to find more simple ways of optimizing your queries and your database performance. If the simplest approaches don’t fix the problem, you may be facing a hardware issue or something more complex. Hiring a temp DBA may be a good idea at that point.
Use Datasets
In cases where you might be re-using a set of records from the database more than once, consider copying those records into a multi-dimensional array, using a primary key (or something else appropriate) as the index of that array. This essentially creates a “cached” version of that recordset that you can use throughout the rest of the script. When it comes time to loop through those records to generate a dropdown box or refer to a value, then you don’t need to go back to the database again. This can also help eliminate an additional JOIN from your queries if all the data you need is in that array. Datasets are most effective when they’re small so they don’t take up much memory and don’t take too much time to loop through.
An example of a good dataset would be a list of car manufacturers (not that many records, and possibly re-used multiple times throughout the rest of the page).
An example of a bad dataset would be an inventory of cars (probably too many records, and you probably wouldn’t re-use them on the same page).
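Here’s a minimal sketch of the manufacturer example (the table and column names are assumed):

$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');

// Copy the records into an array, keyed by primary key:
$Manufacturers = array();
$result = $mysqli->query('SELECT id, name FROM manufacturers');
while ($row = $result->fetch_assoc())
{
    $Manufacturers[$row['id']] = $row['name'];
}

// Build a dropdown without touching the database again:
foreach ($Manufacturers as $ID => $Name)
{
    echo '<option value="' . $ID . '">' . htmlspecialchars($Name) . '</option>';
}

// Later on the same page, look up a name by ID instead of adding a JOIN
// ($Car here would come from one of your own queries):
echo $Manufacturers[$Car['manufacturer_id']];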
Reduce Output
I’ve seen a lot of scheduled jobs / cron job scripts that print out a lot of output, some of which involves calculations and additional processing done purely for the sake of printing to the screen. But if the output isn’t being seen by anyone or processed by anything, then why generate it at all?
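One hedged way to handle it (the flag and variable names here are made up): put the output, and any calculations done only for the output, behind a verbosity flag, so a normal cron run skips both.

$Verbose = (isset($argv) && in_array('--verbose', $argv));
$Start = microtime(true);

// ... the job's real work happens here ...

if ($Verbose)
{
    echo 'Finished in ' . round(microtime(true) - $Start, 2) . " seconds\n";
}

Output is especially draining when it’s inside large loops, which brings us to the next topic.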
Take Back Control Over Loops
Lots of scripts have processes with loops that run tens of thousands, hundreds of thousands, even millions of iterations. This means that every improvement you make is multiplied by the number of times the loop runs. If you have some old debugging code that opens a log file, writes to it, and closes the file, then running that 100,000 times as fast as possible is going to be a really big system hit. Try as hard as possible NOT to run SELECT queries inside loops, because it often means loops within loops (multiplying the speed hit). Even simple things like a substr(), in_array(), or strpos() call can take a bit of processing time when you run them a million times. But if you’re performing the same function with the same arguments over and over again, consider storing the result in a variable and checking that variable instead:
Before:
$MyText = "The Quick Brown Fox";
for ($i = 0; $i < 1000000; $i++)
{
    if (strpos($MyText, "Quick") !== false)
    {
        // do something
    }
}
After:
$MyText = "The Quick Brown Fox";
$QuickIsInMyText = (strpos($MyText, "Quick") !== false); // computed once, outside the loop
for ($i = 0; $i < 1000000; $i++)
{
    if ($QuickIsInMyText)
    {
        // do something
    }
}
(Note the !== false comparison on strpos(): it returns the match position, and a match at position 0 would otherwise be treated as false.) I try to get into the habit of creating boolean flags like $QuickIsInMyText. If you name the variables well ($IsAdmin, $HasEditingPrivileges), they make the code easy to read and keep you from recomputing the same result over and over again.
Install and Use XDebug
XDebug (http://xdebug.org) is a free extension for PHP that can be a godsend. It’s usually easy to install without recompiling PHP, and offers a slew of features for finding performance issues with your application (although it is best used in a development environment, NOT in a production environment).
One of its most valuable features is the profiler, which basically attaches a little homing device to PHP, so when PHP executes your application, the homing device follows it all the way through and logs everything to a file. You end up with a file showing the details of every line of code that was executed in your application and how long each line took to run. Sounds useful but complicated, right? Well, it is… if you were to look at the file manually.
The file that gets generated is called a cachegrind file; it’s pretty big and not meant to be read as-is. Instead, there are free programs like KCacheGrind (for Linux) and WinCacheGrind (for Windows) that will read a cachegrind file and display it in an easy-to-understand fashion. You can see a top-level view of the major points in your program and drill down into the areas that are taking more processing power, right down to the exact line. It’s pretty much a super-charged version of the Log Points I mentioned earlier.
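If you want to try the profiler yourself: with XDebug 2 it can usually be switched on with a couple of php.ini settings. A minimal example (the output directory is just a placeholder, and setting names can differ between XDebug versions):

xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp

Restart your web server, load the slow page, and the cachegrind file should appear in that directory.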
Hopefully these tips will help you get on your way to making your application run faster. Good luck!