I’m on Reddit more than I would care to admit. One of my favorite parts of the website is the subreddit EarthPorn. Despite having the word “porn” in its title, there is nothing obscene about it. It is simply a collection of high-resolution photographs of some of the coolest places on Earth. I found myself occasionally setting a particularly awesome image from the subreddit as my desktop background.
Example of one of the awesome images on EarthPorn. Credit to /u/xeno_sapien.
Even though OS X makes this pretty easy, I figured I could do better. What I ended up writing was a short Python script that automatically pulls down the top-rated images from EarthPorn and places them in a folder. I have my Desktop settings change the desktop background to an image in that folder and switch every hour. The script runs at startup so new images are pulled down each day. With everything in place, this gives me a completely new desktop background every hour!
The script itself is pretty simple, at fewer than 100 lines. It uses urllib, a standard Python library that helps fetch data from the web.
The first thing in the script is a delay (2 minutes) to give the computer time to establish an internet connection. I don’t check whether there is an active connection before proceeding, but I probably should. The next thing I do is delete the old images from the images folder so that it contains only the most recent images about to be pulled down.
for the_file in os.listdir(folder):
    file_path = os.path.join(folder, the_file)
    try:
        if os.path.isfile(file_path):
            os.remove(file_path)  # delete last run's images
    except Exception as e:
        print e  # skip anything that can't be removed
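That missing connectivity check could be as simple as a socket probe before the folder is touched. Here is a rough sketch (the host, port, and retry settings are my own assumptions, not part of the current script):

```python
import socket
import time

def wait_for_connection(host="www.reddit.com", port=80,
                        timeout=2, retries=5, delay=10):
    """Return True once a TCP connection to host succeeds,
    False after all retries fail."""
    for _ in range(retries):
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            time.sleep(delay)  # no connection yet; wait and retry
    return False
```

If wait_for_connection() returned False, the script could exit early instead of deleting the old images and then failing to download new ones.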
Python connects to the Reddit page that lists EarthPorn’s top 50 images of the day. urllib downloads the page source, and a regular expression then scans it for image links. Currently I only look for JPG images hosted on the image-sharing website Imgur. This leaves out some images, such as those hosted on Flickr or formatted as PNG, but it gets most of them.
urlString = "http://www.reddit.com/r/EarthPorn/top/?sort=top&t=day"
htmlSource = urllib.urlopen(urlString).read().decode("iso-8859-1")
urlList = re.findall("http://i.imgur.com/\w+.jpg", htmlSource)
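To illustrate what findall does here, running the same pattern over a small made-up fragment of page source picks out the Imgur JPG links and skips everything else (the HTML below is invented for the example):

```python
import re

# Invented fragment of page source with two links to the same image
htmlSource = ('<a href="http://i.imgur.com/abc123.jpg">'
              '<img src="http://i.imgur.com/abc123.jpg"></a>'
              '<a href="http://www.flickr.com/photo.png">skipped</a>')

urlList = re.findall("http://i.imgur.com/\w+.jpg", htmlSource)
print(urlList)  # the Imgur link appears twice; the Flickr PNG is ignored
```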
Interestingly enough, findall returns two links for each image. I think this is just the way the Reddit page is set up. When I go to grab the images, I simply use every other link in urlList. Each image is saved with the date and an index starting at 0.
imgName = 0
for index in range(0, numUrls, 2):
    url = urlList[index]
    fileName = "img/" + time.strftime("%m") + "_" + time.strftime("%d") + "_" + "%d.jpg" % imgName
    urllib.urlretrieve(url, fileName)  # download the image into the folder
    imgName += 1
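Skipping every other link works as long as each image really appears exactly twice. A more robust alternative, sketched below, is to deduplicate the list while preserving order, which behaves the same even if a link happens to appear once or three times:

```python
def unique_urls(urlList):
    """Drop duplicate links while keeping their first-seen order."""
    seen = set()
    unique = []
    for url in urlList:
        if url not in seen:
            seen.add(url)
            unique.append(url)
    return unique

print(unique_urls(["http://i.imgur.com/a.jpg", "http://i.imgur.com/a.jpg",
                   "http://i.imgur.com/b.jpg"]))
# ['http://i.imgur.com/a.jpg', 'http://i.imgur.com/b.jpg']
```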
There are many ways to get the script to run automatically at startup, but since I am using OS X I decided to use Automator, a utility for automating workflows. I created a new workflow that simply runs the Python script, then exported the workflow as an application. Under System Preferences -> Users & Groups -> My Account -> Login Items I added the new application. Now every time I log in, the workflow runs and grabs new images.
I went into my Desktop & Screen Saver settings and added the folder that Python pulls the images into. I then set the desktop background to change every hour to keep the images it displays fresh and new!
There are a lot of ways I can improve this script. If there is no internet connection, the script will just delete all your images and fail to pull new ones. Sometimes (though not usually) the images posted to EarthPorn are not of desktop quality. I would like to check the resolution of the images and delete the ones that will look bad when set as a background. I could use the Python Imaging Library to do this. Alternatively, EarthPorn requires that the resolution of each image be posted, so I could get more creative with the website parsing and filter the results by resolution as well.
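As a sketch of that second approach, a regular expression could pull the dimensions out of the posted text and filter on them. This assumes the resolution appears in a WIDTHxHEIGHT form, which would need to be checked against the real page markup; the minimum size is also just an example:

```python
import re

def meets_resolution(title, min_width=1920, min_height=1080):
    """Parse a 'WIDTHxHEIGHT' resolution out of a post title and check
    it against a minimum size. Returns False if none is found."""
    match = re.search(r"(\d+)\s*[xX]\s*(\d+)", title)
    if not match:
        return False
    width, height = int(match.group(1)), int(match.group(2))
    return width >= min_width and height >= min_height

print(meets_resolution("Sunrise over the Alps [3840x2160]"))  # True
print(meets_resolution("Foggy valley [1024 x 768]"))          # False
```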
Look for a new update on this script with a bit more functionality soon! The current version is available on GitHub.