Adding 9000 records to the shelf
How AMJ created automatic syncing between Mark Poppen’s online record store and his supplier.
⏳ 3 min read

The task

Funky Moose Records is Mark Poppen’s online record store. Based in Canada, Funky Moose has made its name by supporting indie artists while also stocking more mainstream records. As a result, Mark now runs an e-commerce store with a lot of inventory, and in late 2022 he was looking for a way to make his storefront even larger.

Mark's supplier regularly updates a massive CSV file containing stock quantities for thousands of music titles, covering a wide range of genres and artists, from AC/DC to the works of Beethoven. Funky Moose used to enter records into its storefront manually, a time-consuming process that meant sourcing the details for every listing: title, artist, genre, description, weight, supplier price, and seller price. Mark soon realised the potential for automation in this repetitive, lengthy process, and that is where we came in.

The proposal

Every Friday, an automatic script (hosted in the cloud) would run through Mark’s supplier listings and isolate any records in stock. If a record already existed on Mark’s store, it would be marked as available. If it was new, the script would generate a live listing from scratch, with all of the relevant information.
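As a rough illustration, the core of that weekly job is a simple decision per row: skip anything out of stock, reactivate anything already listed, and create anything new. The sketch below is a minimal Python version of that logic; the column layout and the in-stock rule are assumptions for illustration, not the project’s actual code.

```python
from typing import NamedTuple

class SupplierRow(NamedTuple):
    title: str
    artist: str
    quantity: int

def plan_actions(rows: list[SupplierRow], existing: set[tuple[str, str]]):
    """Decide what to do with each supplier row.

    `existing` is a set of (title, artist) keys already live on the store.
    Returns ("mark_available" | "create_listing", row) pairs for in-stock rows.
    """
    actions = []
    for row in rows:
        if row.quantity <= 0:
            continue  # supplier has no stock, so nothing to list
        key = (row.title.strip().lower(), row.artist.strip().lower())
        if key in existing:
            actions.append(("mark_available", row))  # flip an existing listing back on
        else:
            actions.append(("create_listing", row))  # build a brand-new listing
    return actions
```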

The CSV file held just two pieces of information about each record: the name and the artist (e.g. Damn the Torpedoes, Tom Petty and The Heartbreakers). Mark’s potential shoppers are interested in far more than that, so where was our automation supposed to get the rest of the information?
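Reading that file is straightforward with Python’s built-in csv module. The column names below (Title, Artist, Quantity) are assumptions for illustration, with the quantity column inferred from the supplier’s stock listing described above; the real headers may differ.

```python
import csv

def load_supplier_csv(path: str):
    """Yield (title, artist, quantity) tuples from the supplier's stock file.

    Assumes columns named Title, Artist and Quantity; adjust to the real headers.
    """
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            yield (
                row["Title"].strip(),
                row["Artist"].strip(),
                int(row.get("Quantity", 0) or 0),
            )
```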

The solution

Discogs is an online music database and marketplace that allows users to catalogue, buy, and sell music. Discogs keeps a lot of information freely accessible and, better yet, it has an API (meaning a script can talk to it). If the script could find the correct record on Discogs, it could pull the information needed to build a live Shopify listing (including album artwork) for each new record in the inventory.
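Discogs exposes a database search endpoint that accepts an artist and release title and returns matching releases, including cover art URLs. Here is a minimal sketch of that lookup; you need a free Discogs personal access token, and the fields pulled out at the end (year, genre, cover image) are illustrative rather than a full mapping to a Shopify product.

```python
import requests

DISCOGS_TOKEN = "your-discogs-token"  # personal access token from discogs.com/settings/developers

def lookup_release(title: str, artist: str) -> dict | None:
    """Return the best-matching Discogs release for an album, or None."""
    resp = requests.get(
        "https://api.discogs.com/database/search",
        params={
            "release_title": title,
            "artist": artist,
            "type": "release",
            "token": DISCOGS_TOKEN,
        },
        headers={"User-Agent": "FunkyMooseSync/1.0"},  # Discogs requires a User-Agent
        timeout=30,
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        return None
    best = results[0]
    return {
        "title": best.get("title"),          # usually "Artist - Album"
        "year": best.get("year"),
        "genres": best.get("genre", []),
        "artwork_url": best.get("cover_image"),
    }
```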

We also needed to plan the retrieval of an album description. A little block of text about the album or artist makes a listing more enticing, and it’s great for Search Engine Optimisation (SEO), helping Mark’s listings appear in more Google searches. We definitely wanted this as part of the project.

Time for AI? Not yet.

“Hey ChatGPT – write me a description of Damn the Torpedoes by Tom Petty and the Heartbreakers.”

We were very excited about using AI to generate this content automatically, but we soon ran into a problem. AI does not raise an error when it does not know something; it is determined to produce content, whether it has the facts or not. At first, we did not realise that the odd listing was being peppered with incorrect information, the sort of thing an eagle-eyed music fan would notice. Currently, AI can be confidently wrong. Luckily, we caught on to this (otherwise Mark might have had to explain away some colourful information in his listings).

We therefore turned to an older, wiser source of knowledge. Wikipedia keeps its APIs open and offers a decent search function to the programmer. Thanks to this, we configured the script to search Wikipedia for each album and extract the first paragraph of information.
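Wikipedia’s public APIs make that step fairly painless: the MediaWiki search API finds the most likely article for an album, and the REST summary endpoint returns its opening extract in plain text. The sketch below strings those two calls together; appending “album” to the search query is our own disambiguation heuristic, not something Wikipedia requires.

```python
import requests
from urllib.parse import quote

def fetch_album_summary(title: str, artist: str) -> str | None:
    """Return the first-paragraph extract of the best-matching Wikipedia article."""
    # Step 1: search for the article (the "album" keyword helps disambiguation).
    search = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "list": "search",
            "srsearch": f"{title} {artist} album",
            "format": "json",
        },
        timeout=30,
    ).json()
    hits = search.get("query", {}).get("search", [])
    if not hits:
        return None

    # Step 2: pull the plain-text summary (roughly the first paragraph) of that page.
    page_title = hits[0]["title"]
    summary = requests.get(
        f"https://en.wikipedia.org/api/rest_v1/page/summary/{quote(page_title)}",
        timeout=30,
    ).json()
    return summary.get("extract")
```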

[Note: A nice blend of the above two methods could be passing the Wikipedia entry to the AI and asking it to summarise – this would keep content original, yet factual.]
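For anyone curious what that hybrid might look like, here is a rough sketch using the OpenAI Python SDK. We did not ship this variant, so the model name and prompt wording are placeholders; the point is simply that the model only rewords text we have already verified, rather than recalling facts itself.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarise_wikipedia_extract(extract: str, album: str, artist: str) -> str:
    """Ask the model to rewrite a verified Wikipedia extract as listing copy."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "user",
                "content": (
                    f"Rewrite the following text as a short, enticing product "
                    f"description for the album '{album}' by {artist}. "
                    f"Use only facts present in the text:\n\n{extract}"
                ),
            }
        ],
    )
    return response.choices[0].message.content.strip()
```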

How long would all of this take?

Discogs, Shopify and Wikipedia are not keen on scripts hitting their APIs with multiple requests per second, so we needed to play by the rules they set. A script instructing a modern computer to zip through all of these stock listings while simultaneously talking to Discogs would find itself muzzled fairly quickly. The code needed to be throttled to keep a safe distance from the API rate limits, and as a result the first run of this automation would be an overnight job (with subsequent runs expected to take around 20 minutes once the bulk of the records had been added).
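In practice, “playing by the rules” mostly meant spacing requests out and backing off whenever an API said slow down. A minimal sketch of that throttling, assuming roughly one request per second and honouring HTTP 429 responses, looks like this:

```python
import time
import requests

MIN_INTERVAL = 1.0  # seconds between requests; an assumed safe margin under the rate limits
_last_call = 0.0

def polite_get(url: str, **kwargs) -> requests.Response:
    """GET a URL no faster than MIN_INTERVAL, retrying once on HTTP 429."""
    global _last_call
    wait = MIN_INTERVAL - (time.monotonic() - _last_call)
    if wait > 0:
        time.sleep(wait)  # keep a safe distance from the rate limit
    response = requests.get(url, **kwargs)
    _last_call = time.monotonic()
    if response.status_code == 429:
        # The API told us to slow down; honour Retry-After if present, then retry once.
        time.sleep(float(response.headers.get("Retry-After", 10)))
        response = requests.get(url, **kwargs)
        _last_call = time.monotonic()
    return response
```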

And so, once the script was tested, tested and tested again, I set up my computer to run quietly overnight while it added 9000 records to Mark’s online store. Any error or mishap would have been repeated thousands of times over, so it was important to get it right. Thankfully, I awoke to find the records had been added with the correct information and Mark’s online store had tripled in size. A few hours later, Mark made a sale worth 200 CAD, on a title added by our automation script.

The code, now cloud-based, runs completely autonomously every Friday to keep Mark’s store fresh and updated. Any thoughts or questions on this project? Please get in touch.

Gavin Adams