
Counting Twitter followers

TwitterCounter, the service that tells you how many people followed a given Twitter user on a given date (among other things), has an API – so I thought I’d take a look at it to see whether I could create a quick automated table of rankings.

Here’s the simplest way to query the API:

[code]
http://twittercounter.com/api/?username=mediaczar&output=xml
[/code]

Just cut and paste that into the address bar of your browser, for example. Fairly simple. Change the username and you’ll get the data for a different user. Here’s what you get back from the API: an XML file with lots of rich, meaty data.
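I won’t paste the whole response in here, but the shape of it is something like this (a sketch only: apart from followers_yesterday, which we’ll use below, the element names are illustrative rather than the API’s exact field names):

[code lang="xml"]
<!-- illustrative sketch only: the real response contains many more elements -->
<response>
  <username>mediaczar</username>
  <followers_yesterday>652</followers_yesterday>
</response>
[/code]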

First attempt: Google Spreadsheet’s importXML function

As part of this project, I was particularly interested to explore Google Spreadsheet’s useful importXML function. Google lets you pull data out of an XML document anywhere on the web and put it into a spreadsheet cell.

There’s a pretty average bit of support documentation from Google on this function and a peculiarly hard-to-understand description of the XPath reference that you can read if you want to, but all you really need to know is that you can address any item in the document using a double slash, like so: //item_name.

So if there’s an item called followers_yesterday in the document above that looks like this:

[code lang="xml"]<followers_yesterday>652</followers_yesterday>[/code]

I can access it like this:

[code]=ImportXML("http://twittercounter.com/api/?output=xml&username=mediaczar","//followers_yesterday")[/code]

So with not much work, I can create a Google spreadsheet that will do this for a whole list of Twitter usernames.
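Assuming the usernames sit in column A (that layout is my assumption rather than anything special), each row just needs a variation on the formula above, with the username pulled in from the cell next door:

[code]=ImportXML("http://twittercounter.com/api/?output=xml&username=" & A2, "//followers_yesterday")[/code]

Fill that down the column and each row fetches its own user’s figure.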

Wahey! That’s great! But what’s this?

[Image: Google doesn't let you make more than 50 importXML calls in a single spreadsheet]

I’ve got more than 200 people on the list I want to rank. Foiled by this apparent petty-mindedness on Google’s part, I decided to try again, this time using Yahoo! Pipes.

Second attempt: Yahoo! Pipes

First of all, I make a simple pipe to build the URL and make the API call for a single user (I’ve learned to follow this modular approach from looking at pipes built by the Open University’s Tony Hirst).

[Image: Using Yahoo! Pipes to call the TwitterCounter API]

Now all I need to do is plug this into another pipe. I’m pulling the data from a Google spreadsheet list of Twitter usernames published as a text file. Whenever I update the spreadsheet, of course, the text file is updated dynamically.
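For reference, a spreadsheet published to the web gets a plain URL that other tools can fetch; it looks something like this (the key is a placeholder, and Google has changed the exact format over the years, so treat this as a sketch):

[code]
http://spreadsheets.google.com/pub?key=YOUR_SPREADSHEET_KEY&output=csv
[/code]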

[Image: Adding the API call into a loop on a new pipe]

The first Fetch CSV module pulls that text file in and passes the data to the next module as “username.” The Loop module loops through each username on the list and passes it through the TwitterCounter API Call module I just built. All the XML data from the call will be passed on as “item.data.” Now all I need to do is select the bits I want and format the data for output. Should be simple.

[Image: Rename the elements and output the data]

The most important thing to know about Pipes is that we’re usually working with RSS (there are ways around this). That limits us to only a few fieldnames: generally title, link and description.

Here I’m using the Rename module to set username as the title, and item.data.followers_yesterday as the description. Let’s save the pipe and test it out.
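If that’s wired up right, each item coming out of the pipe should look roughly like this (a sketch of a single RSS item, reusing the mediaczar example from earlier; the real feed wraps these in the usual channel boilerplate):

[code lang="xml"]
<item>
  <title>mediaczar</title>
  <description>652</description>
</item>
[/code]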

[Image: Running the Pipe. Success!]

Success! Now I can take the RSS feed for this and (using Google’s ImportFeed function) drop it back into another spreadsheet.
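Something along these lines should do it (the pipe URL here is only a placeholder, and treat the arguments as a sketch rather than gospel):

[code]=ImportFeed("http://pipes.yahoo.com/pipes/pipe.run?_id=YOUR_PIPE_ID&_render=rss", "items", FALSE, 20)[/code]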

Only right now, I can’t work out how to do that for more than twenty items. So I’ve actually got fewer results than I had before.

I’ll continue plugging away at this and post any solution I come to here – but in the meantime, if you’re reading this and know how to fix my issue with Google Spreadsheets, please help. There’s some hope held out by this article on Sphinn, which I’ll struggle along with.

For now, I can output the results as a CSV, so that’s OK.

