
Plugging the pumps - BashScriptVille Part III

Crossposted from my soon-not-to-be-existing pump: link I think the time has just about come for me to shut my personal pump down (io.jpope.org). I'll likely migrate back to my pump.jpope.org account for a bit (jpope@pump.jpope.org). But, to be honest, I've been considering shutting that pump down as well. After that, maybe I'll pop into my old identi.ca account from time to time. So, if you start seeing some mass un-following from this account, this will be why. As for why? These days, Diaspora* does everything I need/want in a distributed social network. The privacy features work properly (unlike StatusNet err... GNUSocial), and both the web and mobile UIs work properly (unlike pump.io). Plus, the API is finally on its way. Also, with the bulk of my stuff running on various VPSs, I need to trim the fat some to cut costs. Considering my interest in pump.io has been waning for quite some time, it definitely falls into the fat category for me. And I won't even get into my feelings…

BashScriptVille Part II

I've now added a new pump.io (and StatusNet) bot that pulls comics from XKCD. Why? Why the hell not? Don't we all love XKCD? The "bots" are located at XKCD@pump.jpope.org and xkcd@sn.jpope.org if you want to follow. The script I wrote for these two can be found here. For pump.io, it uses a custom script based on one of the scripts included in a standard pump.io install. My script allows titles to be added to a regular note and can be found in my Gogs repo for pump.jpope.org. The pump.io script also has a small mod to post the notes publicly instead of only to Followers (thanks @jrobb!). The bot script also captures the newest posted comic and stores the comic id in a small file, so the next time the script runs, I don't end up posting the newest one multiple times. Well, hopefully I don't. ;) The script also formats the pump.io version in HTML and in Textile for use with the Textile plugin on StatusNet. I now have 11 pump.io bots that live in BashScriptVille: apod@pump.jpope.org bofh@pump.jpope.org f…
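For the curious, here's a minimal sketch of that "don't post it twice" check. The xkcd JSON endpoint (info.0.json) is real; post_note is a hypothetical stand-in for my custom pump.io posting script:

#!/bin/bash
# Minimal sketch of the dedupe check. post_note is a hypothetical
# stand-in for my custom pump.io posting script.

STATE="$HOME/.xkcd-bot-lastid"

# Grab the latest comic's metadata (the JSON comes back on one line).
json=$(curl -s https://xkcd.com/info.0.json)
num=$(echo "$json"   | sed -n 's/.*"num": \([0-9]*\).*/\1/p')
img=$(echo "$json"   | sed -n 's/.*"img": "\([^"]*\)".*/\1/p')
title=$(echo "$json" | sed -n 's/.*"safe_title": "\([^"]*\)".*/\1/p')

# Bail if this comic id matches the one stored on the last run.
[ -f "$STATE" ] && [ "$num" = "$(cat "$STATE")" ] && exit 0

# post_note is hypothetical -- swap in your own posting script.
post_note "$title" "<p><img src=\"$img\"/></p>"

# Remember this id so the next cron run skips it.
echo "$num" > "$STATE"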

BashScriptVille

I have had my Matrix StatusNet instance up for quite some time. This instance consists of six accounts that mostly spit out random notices pulled via various bash scripts. I decided it was time to recreate some of this for pump.io. After installing pump.io, there are a handful of CLI tools in /pump.io/bin. With these scripts, you can register a user, post a note, follow another user, etc. Using the scripts, I've created a few (at the time of this writing, there are 10) accounts to post random notes of useful (or is it useless...) tidbits. Off to the commandline... The first step for me, after deciding on what crap to pull, is to write a bash script to pull said (sed?) crap. Using the good ol' tools curl, cat, grep, sed, awk, etc., I'll parse whatever page down to the actual message that will be sent. For the rest of this, I am assuming that the /pump.io/bin directory is in my $PATH and that registration is open on the target pump. Also, since pump.io is currently being heavily de…
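To make the pattern concrete, here's a rough sketch of how one of these bots hangs together. The exact names and flags of the scripts under /pump.io/bin may vary by version (check each script's usage), and fortune is just a stand-in for whatever crap you're pulling:

#!/bin/bash
# Sketch of the general bot pattern, assuming /pump.io/bin is in
# $PATH and registration is open on the target pump.

# One-time setup: register the bot's account on the pump.
# pump-register-user -u mybot -p 'sekretpassword'

# Pull the crap. Usually this is curl piped through grep/sed/awk
# until only the message text is left; fortune stands in here.
note=$(fortune -s)

# Post the note as the bot.
pump-post-note -u mybot "$note"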

Wunderbot Update

This is a quick update to announce that my Wunderground bot plugin for StatusNet has gotten a couple of new features since its initial release. The first is that if a US zip code or Canadian postal code isn't supplied in the notice to the bot, it'll attempt to get the user's location from the notice. This info is attached to the notice only if the user has allowed it (it may be turned on by default, I can't quite remember). To turn it on, you have to check the "Share my current location when posting notices" checkbox in the settings (i.e. http://identi.ca/settings/profile) and put a valid location in the Location field. Another option is to have your browser or client (some clients, such as Mustard, have this option) attach the location information. If the notice has the location information (which is in latitude/longitude format), the bot will pass the lat/long to the Wunderground API and it'll return the current conditions appropriately. One thing to note is that the bot will use a provided posta…
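The geo path boils down to a single API call. A rough sketch with curl, assuming a Wunderground API key in $WU_KEY; the field names are from memory of their conditions endpoint, so double-check against a real response:

#!/bin/bash
# Sketch of the geo lookup: the notice carries "lat,lon", which
# drops straight into Wunderground's conditions endpoint.
lat="38.58" ; lon="-92.17"   # example coordinates, roughly Jefferson City

cond=$(curl -s "http://api.wunderground.com/api/${WU_KEY}/conditions/q/${lat},${lon}.json")

# Same grep/sed-style parsing as the rest of BashScriptVille.
feels=$(echo "$cond" | sed -n 's/.*"feelslike_string": *"\([^"]*\)".*/\1/p')
hum=$(echo "$cond"   | sed -n 's/.*"relative_humidity": *"\([^"]*\)".*/\1/p')

echo "Feels like ${feels} with the humidity at ${hum}."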

WunderBot

To continue my recent work with the Wunderground API, I have now hacked together a new StatusNet bot plugin. With this plugin, you can ask the bot for the current weather conditions with just the word 'weather' and the five-digit US zip code of your choice. So long as Wunderground recognizes the zip code, it should return some basic details. This plugin is currently active over on my Matrix instance; just ping @Niobe.

@jpope It is currently Overcast and feels like 64.0 F (17.8 C) in Jefferson City. The humidity is 58% and the wind is From the NNE at 2.9 MPH Gusting to 5.8 MPH. #cloudy http://www.wunderground.com/US/MO/Jefferson_City.html
Niobe (niobe)'s status on Monday, 17-Sep-12 22:18:02 CDT - matrix.jpope.org

I had looked into doing this before but hadn't had much success; mostly, my lack of proper coding skills has been the holdup. Previously, I found a bot on identi.ca (source) that already did what I've done here, except the execution is quite different. That bot (which currently does…
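The plugin itself lives inside StatusNet (so it's PHP), but the lookup logic is simple enough to sketch in bash. The zip-based query is just the conditions endpoint again; field names are from memory of the API, so treat this as an illustration rather than the plugin's actual code:

#!/bin/bash
# Illustrative sketch of the zip lookup behind the bot's reply.
zip="65101"   # e.g. someone pings the bot with "weather 65101"

cond=$(curl -s "http://api.wunderground.com/api/${WU_KEY}/conditions/q/${zip}.json")

# Tiny helper to yank one string field out of the JSON.
get() { echo "$cond" | sed -n "s/.*\"$1\": *\"\([^\"]*\)\".*/\1/p"; }

# Assemble a reply along the lines of what @Niobe posts.
echo "It is currently $(get weather) and feels like $(get feelslike_string). The humidity is $(get relative_humidity) and the wind is $(get wind_string)."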

Weather Bots

This has been me off and on the past couple of days:

Playing with bash and the wunderground api.
Jeremy Pope (jpope)'s status on Sunday, 02-Sep-12 10:13:33 CDT - micro.jpope.org

My Matrix bots have been providing me with a few weather details ever since I set that StatusNet instance up. I have a little bash script that pulls the data on a set schedule via cron. Previously, I had been using Google's "secret" weather API, as it pulled quickly, is easily parsed, didn't require an account and had just the few details that I wanted. Judging from the error message I get when attempting to open the XML feed in my browser, they've blocked my IP and/or domain. After a quick search, it turns out the API was shut down. This has also had an effect on my XMPP jsonbot, as it uses the Google feed for its weather as well. :( So, now it was time to find another place to pull from. I prefer to pull from an API as opposed to scraping a page somewhere. I really hate having a script fail due to the page being scrape…
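The cron side is the boring part; something like the below, with the script path and credentials being hypothetical. The posting endpoint is StatusNet's standard Twitter-compatible API:

# Hourly crontab entry for the bot account (script path hypothetical):
#   0 * * * * /home/jpope/bin/weather-bot.sh
#
# Inside the script, the actual post goes out over StatusNet's
# Twitter-compatible API with HTTP basic auth:
conditions="whatever the pull and parse produced"
curl -s -u botuser:password \
  --data-urlencode "status=${conditions}" \
  "https://matrix.jpope.org/api/statuses/update.json"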