So when I last left Picklebot, I had a functioning script that would post to Reddit once each time I ran it by hand. While that's nice, the whole point of the bot is to run without human input. As such, I needed to find a good place to stash the script and have it run automatically.
There are a few options I pursued over the past month in an attempt to find a free solution that wouldn't take a long time to implement. First, I started with the Google Cloud Platform. Google offers many tools and features to host apps in the cloud. However, I ran into a small hiccup with SSL not being supported and scrapped using their platform after a few days. The key here was to find something quick and dirty, not robust.
I then set my sights on AWS and the suite of tools Amazon offers, since I had used them in the past; it's what I eventually settled on. The setup was easy, hosting the script in an S3 bucket was a snap, and I was able to get it up and running in about a day.
I ran into a couple of minor bumps hosting the bot on Amazon: uploading the Python package with its dependencies, and making sure I never exceed the free number of hours for running the script. It turned out I had the directory nested one level too deep for AWS to recognize my package, which was quickly solved by removing the extraneous folder. The second issue I solved with the cron scheduling feature built into Amazon's services, so the script only runs three times a week around the day the playlist is updated.
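For reference, AWS's scheduled-event cron syntax has six fields (with `?` for the unused day field) rather than classic cron's five. A three-times-a-week schedule might look like the following; the specific days and hour here are illustrative, not taken from the original setup:

```
cron(0 21 ? * MON,WED,FRI *)   # 21:00 UTC on Mon/Wed/Fri (example values)
```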
Overall, getting the bot online and functioning independently was a more daunting task than I initially thought. The work involved is not necessarily difficult, but each hosting service has its own nuances and quirks to work through to achieve what you want a bot to do.
Updating the Place
The initial bot code was sufficient for manually running the bot on my terms, but I needed to make a couple of changes if it was going to be posting three times a week autonomously. If the playlist has already been posted this week and the bot is run again, it's important not to post the link twice. Conversely, if the playlist hasn't been updated from the previous week, I don't want to post the link to Reddit until the new songs are up. Below are the additional Python methods I use to check when the cron is run:
```python
# A couple of helper methods for parsing the date information returned by Spotify
from datetime import datetime

def check_dates(dates):
    for date in dates:
        if days_since_update(date) > 1:
            return False
    return True

def days_since_update(d1):
    # "added_at" comes back as an ISO-8601 UTC string, e.g. "2024-01-01T12:00:00Z"
    d1Date = d1.split('T')[0]
    d1Time = d1.split('T')[1].split('Z')[0]
    d1 = datetime.strptime(d1Date + " " + d1Time, '%Y-%m-%d %H:%M:%S')
    return abs((datetime.now() - d1).days)
```
This first code block contains a couple of functions that iterate through each "added at" time on the current playlist. If it has been more than 24 hours for any song, the script stops and waits for the next cron run. 24 hours is an arbitrary threshold; it could be any value less than 7 days, since the playlist is only updated once a week. The check_dates function is then used below in the post logic to gate off when the post will make it to Reddit:
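As an aside, the split-and-rejoin dance isn't strictly necessary: strptime can consume the literal T and Z in the timestamp directly. The timestamp below is just an example value, not one from the real playlist:

```python
from datetime import datetime

stamp = "2024-01-01T12:00:00Z"  # example "added_at" value
parsed = datetime.strptime(stamp, '%Y-%m-%dT%H:%M:%SZ')
print(parsed)  # → 2024-01-01 12:00:00
```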
```python
# Checks if the newest tracks have been added to the playlist.
if check_dates(trackDates):
    postTitle = "This Week on The Pickle Jar: %s, %s, %s, and more!" % (
        top3Artists[0], top3Artists[1], top3Artists[2])
    # Checks to make sure a post of the same name has not already been posted
    # to the subreddit (don't want to spam the subreddit!)
    searchResults = pickleInstance.search(postTitle, "relevance", "cloudsearch", "month")
    alreadyPosted = False
    # Should only be one
    for submission in searchResults:
        alreadyPosted = True
    # Finally posts to reddit if all the criteria are met.
    if not alreadyPosted:
        pickleInstance.submit(postTitle, url=results['external_urls']['spotify'])
        print("Posted")
    else:
        print("Already Posted")
else:
    print("Too old")
```
This is also where I check whether the post already exists in the subreddit. Since the check requires the praw subreddit object, I left it in the logic itself rather than factoring it out like the date functions. The print output appears in the AWS console on every cron run, so I can monitor what happened each time the script fires.
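For what it's worth, the duplicate check could still be factored out by passing the subreddit object in as a parameter. A minimal sketch, where the fake subreddit class is purely illustrative and stands in for a praw subreddit exposing the same search(query, sort, syntax, time_filter) shape:

```python
def already_posted(subreddit, title):
    # True if the subreddit search returns at least one matching submission.
    # Assumes a praw-style search(query, sort, syntax, time_filter) method.
    for _ in subreddit.search(title, "relevance", "cloudsearch", "month"):
        return True
    return False

# Illustrative stand-in for a praw subreddit, just to exercise the helper
class FakeSubreddit:
    def __init__(self, results):
        self.results = results

    def search(self, query, sort, syntax, time_filter):
        return iter(self.results)

print(already_posted(FakeSubreddit(["existing post"]), "a title"))  # → True
print(already_posted(FakeSubreddit([]), "a title"))                 # → False
```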
That's basically it! The bot now lives on AWS, staying within the free tier Amazon gives new users, and posts the playlist every week. The full source code can be found here, as well as the source from the first post outlining the original logic. The playlist itself can be found at promotelocal.com/thepicklejar and is updated every Monday afternoon. Thanks for reading!