All bloggers start out small. You won’t start blogging by paying over $20 a month for hosting, so you go for cheap shared hosting. That is a fine solution, but it has one small drawback: you don’t get console access to the machine your blog is hosted on (except with BlueHost and a few others). In most cases you won’t care, but there are times when such access is more than valuable and saves a lot of time.
The most common problem with a WordPress-powered blog is not being able to use the automatic update. When you try it, WordPress asks for an FTP/SFTP username and password to perform the update, which ruins the simplicity of the whole feature. The cause is the file permissions on the server. If you open an FTP connection to your site with any major FTP client, you will see a column listing each file’s permissions. These must be set so that the web server has permission to write to your files, which is done by adding “write” permission to the group. Without SSH access to your Linux server, that can be troublesome: doing it through the FTP client takes too long, so you end up updating by hand. This is where the powerful “exec” command comes in.
This is actually a PHP function that passes its argument down to the operating system as a shell command. So, if you want to change file permissions in bulk without using your FTP client, here is what to do. First, create a PHP script with your favorite text editor (I recommend Notepad++) containing the following:
<?php
// Recursively add group write permission (mode 775) to everything
// under the current directory.
exec("chmod -R 775 ./");
?>
Then save it, naming it for instance “foo.php”. Upload this file to the root of your WordPress installation directory. Finally, invoke the script by visiting “http://www.yoursite.com/foo.php”. Voila! That should have taken care of the file permissions. Try the auto-update function now…
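One thing worth knowing: exec() can also tell you what happened. It takes optional parameters that capture the command’s output lines and its exit status, so instead of a blank page you get a real success or failure message. Here is a sketch of the same chmod script with basic error reporting added (the “2>&1” redirect is there so error messages are captured too):

```php
<?php
// Run chmod recursively; $output collects any lines the command prints,
// and $status receives its exit code (0 means success).
exec("chmod -R 775 ./ 2>&1", $output, $status);

if ($status === 0) {
    echo "Permissions updated successfully.";
} else {
    echo "chmod failed (exit code $status): " . implode("\n", $output);
}
?>
```

That way, if the host blocks exec() or the command fails, you will at least see why.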
Backups are another painful task. Modern websites consist of hundreds or even thousands of small files (scripts, images, and so on), which makes backing up over FTP very slow: the client has to send at least three commands to the server just to fetch a single small file, then visit each sub-folder, again and again… Even if your connection to the server is fast, a backup done this way will be extremely slow. To speed things up, you can archive the whole site into one file on the server and download just that one, which is much faster. Again, without a console things are a bit hard, but “exec” comes to the rescue again. Following the same process as above, create a script file containing:
<?php
// Pack the whole site into a single uncompressed tar archive.
exec("tar -cf backup.tar ./");
?>
That’s it! Upload and run. This creates a file called “backup.tar” in the root directory of your blog; just download it. Now, there are two things I’d like to point out:
- There is a chance the script’s execution time will run out before the job is done. In that case you can archive each directory on its own, which keeps each run short.
- A plain .tar does not compress at all; tar only bundles the files together. To actually compress the archive, you can use the .bz2 format by changing the script to the following:
<?php
// Pack and bzip2-compress the site in one step
// (the "j" flag enables bzip2 compression).
exec("tar -cjf backup.tar.bz2 ./");
?>
That will compress it well, but it will also consume more time (see point 1).
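To put point 1 into practice, here is a sketch of a per-directory backup: it archives each top-level directory into its own compressed file, so no single exec() call has to process the whole site at once. It also asks PHP to lift its execution time limit with set_time_limit(), though many shared hosts ignore that setting:

```php
<?php
// Ask PHP to remove the execution time limit (shared hosts may ignore this).
set_time_limit(0);

// Create one compressed archive per top-level directory, so each
// tar invocation stays small and is less likely to time out.
foreach (glob("./*", GLOB_ONLYDIR) as $dir) {
    $name = basename($dir);
    // escapeshellarg() keeps odd characters in folder names from
    // breaking the shell command.
    exec("tar -cjf " . escapeshellarg("backup-$name.tar.bz2") . " "
        . escapeshellarg($dir));
    echo "Archived $name\n";
}
?>
```

You then download the resulting “backup-*.tar.bz2” files one by one, which is still far faster than fetching thousands of individual files over FTP.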
One final task that eats up a lot of time is folder deletion. For the same reason as with backups, deleting a folder and all its contents through FTP can be very slow: the client has to visit each folder, issue a command for every file, and wait for a response each time. That takes much longer than a single “rm” command from the console. On shared hosting, you guessed it, exec comes to the rescue. Let’s say you want to delete a folder called “bar” that contains many subfolders and files. Here is a script that can do the job:
<?php
// Recursively and forcibly delete the "bar" folder and everything in it.
exec("rm -rf bar");
?>
Be very, very careful with this one, fellas. The “rm” command means “remove”, and the “-rf” options mean “recursively” and “force”. In other words, if you don’t use the right folder name, things can go extremely wrong (using ./, for instance, would delete all the contents of the current folder).
When you are done with your maintenance, be sure to delete these command scripts, just in case someone stumbles upon them and decides to run them. All in all, this process will save you a lot of maintenance time on shared hosting. I know it saved me!
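If you tend to forget that cleanup step, one trick (a sketch of my own, not something you strictly need) is to have the script delete itself with PHP’s unlink() once the work is done, so it can only ever run once:

```php
<?php
// Do the maintenance work first and record whether it succeeded.
exec("chmod -R 775 ./", $output, $status);
echo $status === 0 ? "Done." : "Something went wrong.";

// Then remove this very script file so nobody can invoke it again.
if (is_file(__FILE__)) {
    unlink(__FILE__);
}
?>
```

After one visit to the script’s URL, the file is gone from the server and there is nothing left for a curious visitor to find.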
Photo by ryancr
Hi Stratos,
I don’t think the auto update problem is simply an issue of file permissions. I changed them, tried it again and then changed them back when it still didn’t work.
Also, with this script everything is set to 755 – I don’t want all my files to have that level of permissions.
I’m still going to try it but I’m not hopeful.
Yeah – it’s something else with my host. I made the script, uploaded it and it changed all of the permissions, but I am still being asked for my ftp login when I auto-upgrade a plugin.
This will probably help a lot of people but there is something else going on with my host. I need to change hosts but I’ve been too lazy to deal with it.
@Kim: There is a slight possibility that the web server’s user does not belong to your group, so the “775” rights won’t do. You will probably have to go with “777”. I know it’s not pretty, but try it just for the sake of argument and let me know. After you are done you can return them to the normal “755”. If that fails, let me know…
I agree with Kim; for me too it asks for a username/password every time, even when I have set the rights to “777”.
Actually I have another problem w.r.t. file permissions. When I use a plugin (one click installer) to install another plugin, it creates the files as “nobody” instead of my FTP username. So every time I need to chown them to my ID. Maybe I will use exec to chown the folder as you have suggested!
@Raju: Well, the plugin sets the UID to be the same as the web server’s. Now, I have no idea if you want to change that, and I am pretty sure you can’t 😉 (unless you have root privileges on the system running your site).
Yes, I do have root access. But I wonder why the UID is set to “nobody”. Anyway, a little bit of trial and error is needed, I suppose.
@Raju: Well, having root makes it easier to delete. As for “nobody”, it’s the default user for the web server on OSes like CentOS. On Debian-based systems it’s www-data. So, the scripts change the ownership to the web server’s user…
Automatic updates were never a problem for me, Stratos. I reckon I was lucky in that respect at least.
I don’t think that starting on the cheapest hosting plan is a good idea. Even if you’re not expecting a high volume of traffic just yet, you probably want to get there at some point. I think it’s better to save yourself some trouble in the future and pay for something a bit better.
@Sire: Well, it should work. When it doesn’t, the host has done something wrong, as in Kim’s situation.
@Gry: When I started blogging, I had no idea whether it was for real or just something for fun. It took more than four months to see that I was consistent and was digging the whole process… So, it all depends on how you start, I guess…
Some useful snippets there… may have to file them away for later 😉
This is great. For any file operations I use FileZilla, which is slow since my internet connection is also slow.
These tips are great…
Thanks.