programmatically liking things on facebook

Continuing along the lines of using social media as a medium, I’ve been playing around with Facebook’s API recently and wanted to “Like” things.  I couldn’t find out how to do it (more on that later), so I started to play with client-side ways to do it. You know, because sometimes you’re just way pumped and just want to “Like” everything.

Here is a script that collects all of the “Like” buttons for comments and items currently loaded in the browser and clicks them.
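
In rough terms the script does something like this (matching on link text is an assumption about Facebook’s markup, not the original code):

```javascript
// Grab every link whose text is exactly "Like" and click each one.
// Matching on link text is an assumption about Facebook's markup;
// inspect the page and adjust if the Like buttons are built differently.
var links = document.getElementsByTagName('a');

for (var i = 0; i < links.length; i++) {
  if (links[i].textContent === 'Like') {
    links[i].click();
  }
}
```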

 

I used it a couple times and I got this message:

So it’s probably best to include some time delay. I’m not really sure what the acceptable rate of liking is, though. Basically, if you use it on fewer than about 50 things in a second, it’s fine; I did this a lot while writing and testing it. It’s when I scrolled down to load a ton of stuff and then used it that I got the message. I’d test it again, but I’d rather not have the power to ‘like’ revoked. If you figure this out, do share!
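
One way to space the clicks out is something like the following; the 500 ms delay is just a guess, not a known limit.

```javascript
// Same idea, but paced: click one "Like" link every 500 ms.
// The delay is a guess; Facebook doesn't publish an acceptable rate.
var likeLinks = [];
var links = document.getElementsByTagName('a');

for (var i = 0; i < links.length; i++) {
  if (links[i].textContent === 'Like') {
    likeLinks.push(links[i]);
  }
}

var index = 0;
var timer = setInterval(function () {
  if (index >= likeLinks.length) {
    clearInterval(timer);
    return;
  }
  likeLinks[index++].click();
}, 500);
```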

Also, it turns out you can Like things with the Facebook API. You just find the ID of the item and then send a POST request to https://graph.facebook.com/ID/likes. This requires extended permissions to publish actions.
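
If the JavaScript SDK is loaded and the user has granted those permissions, the call can look roughly like this (OBJECT_ID is a placeholder):

```javascript
// Like an object through the Graph API by POSTing to /<ID>/likes.
// OBJECT_ID is a placeholder; the logged-in user must have granted
// the extended permission to publish actions.
FB.api('/OBJECT_ID/likes', 'post', function (response) {
  if (!response || response.error) {
    console.log('Error liking:', response && response.error);
  } else {
    console.log('Liked it.');
  }
});
```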

introducing: right now all around

[site example image]

right now all around is a way to view recent public Instagram posts, with the option to retweet posts you like.  It’s kind of a mash-up of elements of Twitter, Instagram, and Tumblr.

There is no filtering of the posts; it is just a sample of all the images posted most recently. The result is a tool of serendipity and exploration. What I find compelling is the juxtaposition of so many different captured experiences, like people watching on a train but spanning the whole world, and in so many different settings.

In some cases people who shared photos only had 100 or 200 followers to see them. Now many more can see, share, and comment on the photos.

Instagram made it easier for people to take beautiful photos. right now all around builds on the creativity Instagram enables by creating a collective image stream. In a way the result is visual poetry of what people are doing, feeling, or wanting to remember.

The time of day plays a role: if you look at the app at 4 AM EST, for instance, you will see more photos from Southeast Asia.

In the non-mobile version, an updating elapsed time is shown. Usually as you scroll down, you are looking at older bits of content. In this case, with every API call, new images are brought up. If you refresh, you will only see new images. The app is always looking forward, and there is no memory.

I’ve found it a great tool for seeing new memes, especially ones that are subculture-specific, that I would normally never come across through my usual information channels.  Last year there was a lot of talk of the filter bubble: the danger of being siloed by recommendation algorithms.  With the growth of APIs we have more control over our information sharing experiences.  This is in part an exploration of that, which I hope to pursue in other contexts and media (for example news and Facebook) as well.

See right now all around.

Take a look at the code.

Technical details: I used this as an opportunity to play with the Backbone.js framework, building a client-side app with JSONP calls to the Twitter API.  No authentication is required, and the API is limited to 150 requests per hour per IP address, as described in the Twitter API documentation.
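
A minimal sketch of that pattern, not the app’s actual code (the query and the fields pulled out of the response are assumptions for illustration):

```javascript
// A Backbone collection backed by the Twitter search API over JSONP.
var Posts = Backbone.Collection.extend({
  url: 'http://search.twitter.com/search.json?q=instagram.com&rpp=50',
  parse: function (response) {
    return response.results;
  }
});

var posts = new Posts();

// dataType: 'jsonp' is passed through Backbone.sync to jQuery.ajax,
// so no authentication or same-origin proxy is needed.
posts.fetch({
  dataType: 'jsonp',
  success: function (collection) {
    collection.each(function (post) {
      console.log(post.get('from_user'), post.get('text'));
    });
  }
});
```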

 

Some close-up screenshots of real-time geolocated tweets

I built a tool yesterday to observe geolocated tweets as they come in on a world map.  It’s kind of cool to zoom in on specific cities.  After just 30 seconds I had gathered quite a few in New York.  Clicking on a marker reveals the profile pic and the tweet.
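
The marker-and-popup part is plain Leaflet; a sketch of it, using the Twitter API’s tweet fields, might look like this:

```javascript
// Add a marker for a geotagged tweet and show the profile picture and
// text in a popup when it is clicked. The tweet's GeoJSON coordinates
// are [longitude, latitude], while Leaflet expects [latitude, longitude].
function addTweetMarker(map, tweet) {
  var lng = tweet.coordinates.coordinates[0];
  var lat = tweet.coordinates.coordinates[1];

  L.marker([lat, lng])
    .addTo(map)
    .bindPopup(
      '<img src="' + tweet.user.profile_image_url + '" width="48">' +
      '<p>' + tweet.text + '</p>'
    );
}
```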

And then zooming out I could see clusters along cities in the Northeast.

After several minutes I zoomed out to see the world map, but at that point there were too many tweets and my browser crashed.  This is where selective coarsening would be useful.

You can find the project on github.

Using nodejs event streams and leafletjs to display geotagged tweets on a world map

I thought it would be cool to see tweets come in live using the Twitter Streaming API, querying with location.  When you query with a location bounding box, nearly all of the tweets that come in have geocoordinates (though a small fraction are null).

Initially I wanted to focus in on a city (Leaflet maps look incredible when you zoom into a city), but the Twitter Streaming API was taking too long to fetch tweets while testing, so I set the bounding box to the world.  You can change the Twitter fetch bounding box, as well as the initial map bounding box.
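
Both boxes are just pairs of corner coordinates; for example (the city numbers are rough and only for illustration):

```javascript
// The Streaming API takes bounding boxes as 'swLng,swLat,neLng,neLat'.
// The whole world:
var fetchBox = '-180,-90,180,90';
// A rough box around New York City (approximate, for illustration):
// var fetchBox = '-74.3,40.5,-73.7,40.9';

// The initial map view is set separately; a Leaflet world view could be:
var map = L.map('map').setView([20, 0], 2);
```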

This is my first time using them, but from what I understand, nodejs event streams allow you to send chunks of data from the server to the browser as they come in.  This is pretty cool for real-time applications.  I wanted to focus this application on immediate tweets, and right now there is no database.  Whenever you run it, you get whatever is coming in on the Twitter pipeline.
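
A minimal sketch of that pattern (tweetStream here is a stand-in for the readable stream coming off the Twitter connection, not the project’s actual variable):

```javascript
var http = require('http');
var EventEmitter = require('events').EventEmitter;

// Stand-in for the stream of parsed tweets coming off the Twitter
// Streaming API connection.
var tweetStream = new EventEmitter();

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });

  // Forward each tweet to the browser as a chunk the moment it arrives.
  // The response is never ended, so the connection stays open and nothing
  // is stored in between.
  tweetStream.on('data', function (tweet) {
    res.write(JSON.stringify(tweet) + '\n');
  });
}).listen(8080);
```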

Take a look at the project here.

UPDATE: Note that you need to edit the config.js file with a Twitter username and password, because querying the Streaming API with location is otherwise forbidden.  If you are using an account that already has many apps constantly querying the API (tweet harvesting, for example), then you may experience a lag in the rate of fetching.  This should not be an issue for most people, and it can easily be remedied by creating another Twitter account to query the API with.
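
The exact keys are whatever the project’s config.js expects; the idea is something along these lines (field names here are illustrative):

```javascript
// config.js: field names are illustrative; check the project's own
// config.js for the keys it actually expects.
module.exports = {
  username: 'your_twitter_name',
  password: 'your_twitter_password'
};
```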