Collective algorithmic manipulation

My partner and I took out a Spotify Duo subscription a couple months ago. Turns out she wasn’t that happy with her recommendations being polluted by my preference for hip-hop, movie soundtracks, grime, endless repeats of Lord of the Rings and Elder Scrolls playlists, and old Drake albums. This meant that—algorithmically—I had a clean slate. A new beginning. I used it to rebuild a refined library of favoured albums and playlists. This wasn’t enough, though.

Completing a movement session in our garden the other day, I caught myself attempting to push back against Spotify’s proprietary, disembodied influence on my musical consumption. I was quickly skipping songs that, objectively, I liked. I just didn’t want them to show up in a particular playlist. Similarly, I was allowing certain songs to play out in full, hoping I could encourage their ilk to continue to magic themselves into my recommendations.
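
To make that concrete, here is a minimal sketch of how a recommender might fold those implicit signals into a taste profile. It is purely illustrative: Spotify’s actual signals and weightings are proprietary, and everything below (the `update_taste_profile` function, the 30-second skip cut-off, the weights) is my own assumption.

```python
from collections import defaultdict

# Illustrative only: the event fields, weights, and 30-second threshold are
# assumptions, not Spotify's actual (proprietary) logic.
SKIP_THRESHOLD_SECONDS = 30

def update_taste_profile(profile, events):
    """Fold a batch of listening events into per-artist preference scores.

    Each event is a dict like:
        {"artist": "...", "seconds_played": 12, "track_length": 240}
    A quick skip is treated as a negative signal, a full (or near-full)
    play as a positive one, and anything in between as weakly positive.
    """
    for event in events:
        played = event["seconds_played"]
        length = event["track_length"]
        if played < SKIP_THRESHOLD_SECONDS:
            signal = -1.0   # quick skip: push this artist down
        elif played >= 0.9 * length:
            signal = +1.0   # played out in full: pull similar music in
        else:
            signal = +0.2   # partial listen: weak positive
        profile[event["artist"]] += signal
    return profile


if __name__ == "__main__":
    profile = defaultdict(float)
    session = [
        {"artist": "Drake", "seconds_played": 11, "track_length": 210},
        {"artist": "Howard Shore", "seconds_played": 300, "track_length": 312},
    ]
    update_taste_profile(profile, session)
    print(dict(profile))  # e.g. {'Drake': -1.0, 'Howard Shore': 1.0}
```

In this toy model, an eleven-second Drake listen counts against him and a near-complete soundtrack counts in its favour, which is exactly the lever I was pulling in the garden.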

These aren’t sophisticated strategies for algorithmic manipulation. However, I did wonder: collectively, have we entered an era of burgeoning algorithmic competence?

Why do I ask? Here’s an example. My partner recently sent me a TikTok of a Ricky Gervais interview after an early Golden Globe award. As a result, she was served even more Ricky Gervais content. Algorithms that influence consumption are now so responsive that adaptations like these are obvious in real time. More and more people are becoming aware of how their actions shape these algorithms.

It could be likened to the difference between first-order and second-order cybernetics: “the cybernetics of observed systems” versus “the cybernetics of observing systems”. We’re evolving towards the latter. General awareness of media consumption algorithms is altering our interactions with them.
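
One way to see the distinction is to compare a listener who simply acts on taste with one who acts on a model of the system observing them. The toy simulation below is a caricature built entirely on assumptions (two genres, a recommender that just counts full plays); it describes no real platform.

```python
import random

random.seed(0)

# Toy model of first-order vs second-order behaviour. Both listeners like
# hip-hop and soundtracks equally, but the "aware" listener strategically
# skips hip-hop to steer what the recommender reinforces.
GENRES = ["hip-hop", "soundtracks"]

def simulate(listener, steps=1000):
    counts = {genre: 0 for genre in GENRES}
    for _ in range(steps):
        genre = random.choice(GENRES)  # recommender offers a track
        if listener(genre):            # listener lets it play in full
            counts[genre] += 1         # recommender reinforces that genre
    return counts

def naive(genre):
    """First-order listener: acts on taste alone, unaware of the feedback loop."""
    return True  # likes both genres, lets everything play

def aware(genre):
    """Second-order listener: acts on a model of the system observing them."""
    return genre != "hip-hop"  # skips songs they like, to shape the profile

print("naive listener:", simulate(naive))
print("aware listener:", simulate(aware))
```

Run it and the naive listener’s profile ends up split roughly evenly, while the aware listener’s profile contains no hip-hop at all, despite both liking it equally.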

Most interesting is speculating about how this dynamic will play out across different group sizes and timescales. What happens when groups (I like the classification featured in Venkatesh Rao’s Unflattening Hobbes: individuals, packs, troops, tribes, imagined communities) deliberately manipulate algorithms across different timescales, from minutes and hours to days, weeks, months and years?

It’s reasonable to think that this is already happening at smaller scales. It’s also reasonable to suggest that, beyond some critical scale or scope, such manipulation will have surprising effects.