Illustration of a Pathfinder using binoculars to gaze towards a hill where a new moon rises.
As the moon completes another orbit around Earth, the Pathfinders Newmoonsletter rises in your inbox to inspire collective pathfinding towards better tech futures.

We sync our monthly reflections to the lunar cycle as a reminder of our place in the Universe and a commonality we share across timezones and places we inhabit. New moon nights are dark and hence the perfect time to gaze into the stars and set new intentions.

With this Newmoonsletter, crafted around the Tethix campfire, we invite you to join other Pathfinders as we reflect on celestial movements in tech in the previous lunar cycle, water our ETHOS Gardens, and plant seeds of intentions for the new cycle that begins today.

Tethix Weather Report

🏜️ Current conditions: Bring your own water; fire practitioners are squeezing everyone and everything dry

Do not be fooled by the rainbows magically appearing on social media profiles this month. The fire practitioners of Silicon Valley are still focused on squeezing every last bit of productivity from the workers they haven’t yet fired. The race to deliver AI suggestions nobody asked for is still very much on! And if you’re getting strange answers, it’s basically your fault for asking it wrong, or not providing more free labour for higher quality training data. (See: AI engineers report burnout and rushed rollouts as ‘rat race’ to stay competitive hits tech industry, OpenAI putting ‘shiny products’ above safety, says departing researcher, Google’s “AI Overview” can give false, misleading, and dangerous answers, and Google’s weird AI answers hint at a fundamental problem)

Fire practitioners are graciously making AI golems available to millions of people around the world, and uncle Sam, Mark, and others need you. Regardless of whether you’re an A-list celebrity, an overworked tech worker, or just enjoy posting pics of your meals on Instagram, your data remains crucial for maintaining their growing AI golem army. They kindly ask you to think twice when ethical or legal concerns force them to give you the choice of opting out of AI training: the AI golems are starving! (See: Scarlett Johansson Says She Was ‘Shocked’ and ‘Angered’ Over OpenAI’s Use of a Voice That Was ‘Eerily Similar to Mine’, Slack users horrified to discover messages used for AI training, and Instagram is training AI on your data. It’s nearly impossible to opt out)

Besides, the latest great council of white men sees great strategic growth opportunities in the continued exploitation of your data. So be a good sport, and you might just get the privilege of paying extra subscriptions for AI golems who were trained on your data. AI golems will date, work, and think just for you, and yes, make fire practitioners richer. But who better to solve problems that shouldn’t exist than the people who helped create them in the first place, right? (See: Meta’s new AI council is composed entirely of white men, AI personas are the future of dating, Bumble founder says, The CEO of Zoom wants AI clones in meetings, and Microsoft’s AI chatbot will ‘recall’ everything you do on its new PCs)

Meanwhile, OpenAI’s uncle Sam thinks “we” are not worried enough about the socioeconomic impacts of the AI golems his company keeps unleashing. By “we” he probably means his fellow fire practitioners who are focusing on crushing things; everyone we talk to seems pretty concerned about the current labour market, especially in tech and creative industries. (See: Sam Altman doesn't think we are worried enough about how AI will impact the economy and Apple's 'Crush' iPad Pro ad sparks intense backlash from creatives)

We also think uncle Sam and his friends aren’t worried enough about the environmental impacts of building and maintaining such large AI golem armies. To put it in terms they do understand: “a 1C increase in global temperature leads to a 12% decline in world gross domestic product”, a recent report finds. And “the world economy is committed to an income reduction of 19% within the next 26 years independent of future emission choices”, another study warns. All the greenwashing in the world doesn’t change the fact that renewably powered data centers don’t magically grow in the clouds and still need water to cool the growing armies of AI golems. (See: Microsoft's carbon emissions up nearly 30% thanks to AI, Microsoft employees spent years fighting the tech giant’s oil ties. Now, they’re speaking out., The ugly truth behind ChatGPT: AI is guzzling resources at planet-eating rates, Economic damage from climate change six times worse than thought – report, and The economic commitment of climate change)

So, as the children of Silicon Valley continue playing with fire, water, and other resources to keep the machinery of AI running and putting our children into further ecological debt, we advise you to seek shade and shelter in weird communal gardens – either online or offline – as explored in our recent Pathfinders Podcast. Uncle Sam, Mark, and others appear determined to squeeze us all dry in their pursuit of AGI and immortality, so be sure to bring your own drinking water while you still can, and share it with other Pathfinders generously. We hope the seeds of inspiration we collected in this Newmoonsletter can help you find paths to safe havens in this scorching AI landscape.

Tethix Elemental seeds

Fire seeds to stoke your Practice

While tech executives keep their heads buried in silica sand, we’re happy to report that the environmental impacts of large language models and other SALAMI are finally getting more mainstream coverage. A team within Hugging Face has even put forward an Energy Star AI Project proposal that would make it easier to select AI models based on their energy use. Of course, energy use is just the tip of the melting iceberg in the polycrisis we’re facing.

As tech bros keep dreaming of elusive AGI in the cloud, the real world keeps reminding us of how imperfect our technologies still are. The recently reported heat record in India turned out to be a sensor malfunction – though a heat record of 49.1C instead of 52.9C is still nothing to celebrate – and recent solar storms messed up the GPS navigation systems of self-driving tractors during peak planting season, which will cause trouble come harvest.

Perhaps it isn’t a bad idea to develop more home-cooked software that takes a local-first approach, rather than uploading all our trust and data into the cloud and disembodied AI assistants that suggest gluing cheese on your pizza and eating rocks. Google’s AI Overview is a good example of how technical capabilities alone don’t make up for a lack of taste – both figuratively and literally.

Younger generations also seem to be questioning how smart our phones need to be and rediscovering the joy of playing Snake on the intentionally dumb Nokia 3210. And in the Netherlands’ Offline Club, you can now pay for the privilege of hanging out with people in person without technology getting in the way.

Air seeds to improve the flow of Collaboration

While digital technologies are supporting collaboration at a scale never seen before, the current social infrastructure of the internet feels increasingly exhausting. As we discuss in the Pathfinders Podcast, we need to intentionally nurture weird online places where we can be playful together, and hopefully create the right conditions for a scenius – collective genius – to emerge in our communities.

However, the playful weirdness that leads to creativity and innovation also needs boundaries and rules that keep everyone safe. As corporations momentarily turn their attention from greenwashing to rainbow-washing, GLAAD – the LGBTQ advocacy non-profit – is here to point out that most social media platforms still don’t get a passing grade in their Social Media Safety Index. GLAAD reminds us that while social media platforms continue profiteering from hate, the lack of safety in online spaces also leads to serious offline harms.

The move fast and break things approach is clearly breaking many things for many communities. That’s why people outside the VC-backed tech bubble tend to “think different”, as Apple once advertised: from decentralised software on the Fediverse to the ideas and practices developed by Taiwan’s Digital Minister Audrey Tang and her collaborators. The latter are now being presented in depth in the open-source project and book Plurality, which explores how technology can enable diverse and open collaboration in different sectors. We can’t wait to dig into it!

And if you’re looking for additional inspiration on how to nurture healthier communities in different settings, we encourage you to explore the Community Weaving framework.

Earth seeds to ground you in Research

You’ll rarely find people developing AI talking about nurturing healthier communities, but they do often talk about aligning AI to human values, or even more vaguely about “benefiting all humanity”. An interesting goal given that recent research points to a worldwide divergence of values and makes the observation that WEIRD – Western, Educated, Industrialised, Rich, and Democratic – subjects have become even WEIRD-er over the last forty years when it comes to their social and moral values. How AI models developed in WEIRD contexts will be able to deal with this divergence of values remains a mystery.

There’s clearly a need to involve people with more diverse expertise (and backgrounds) in the development of AI systems. This doesn’t mean just involving people in reinforcing model outputs with human feedback, but incorporating diverse perspectives at every step of the process. As an example, the Center for Democracy & Technology’s AI Governance Lab has released practical guidelines on how to include social science and humanities expertise in the development and governance of AI systems. (Although we’d also add specifically seeking non-WEIRD sociotechnical expertise to the list.)

And while tech companies continue moving at the speed of tech, some are at least publishing better-intentioned papers, like the recently released Ethics of Advanced AI Assistants, courtesy of Google DeepMind. (Yes, owned by the same Google that shipped the AI assistant that advises people to eat rocks.) The 200+ page paper is practically a book, so we appreciated Andrew Maynard’s overview and the highlighted reading guide.

Speaking of papers, academic publishers are now making it clear that AI assistants do not count as peers when it comes to peer review of scientific papers – it appears AI assistants are only allowed to come after paid jobs. Good to know.

But fear not, Google DeepMind has also released a new framework that promises to warn humanity should an AI model appear to be approaching dangerous capabilities. In case you’re wondering: an increase in energy demands or resource usage is not considered a dangerous capability. Big tech companies continue to tell us that AI will help us tackle climate change, but preserving a liveable planet for our children and other species is not among the AI principles published by Google, Microsoft, or OpenAI. Go figure.

Water seeds to deepen your Reflection

This is usually the part where we try to inspire you and make you wonder about the beauty of the Universe. But this time, we wanted to reassure you that it’s ok not to be ok as we continue setting the world on fire. As we sacrifice care and depth for speed and scale. As we ignore the sounds of alarm. As life becomes stranger than fiction. It really is ok not to be ok.

But it’s also ok to take a step back when you need to. To take a walk, especially in nature, and truly think different. Not necessarily about solutions, but the big picture. The assumptions, the stories we take for granted. And perhaps get closer to embracing a planetary worldview.

Tethix Moonthly Meme

We wonder how often uncle Sam and his friends get the chance to walk in nature given the lack of (moral) imagination they display when it comes to imagining possible tech futures. We can’t take those walks for them, but we do like to explore how we might use their AI creations to feed their imagination.

Recently, Alja decided to enlist ChatGPT’s help in imagining an alternate future in which OpenAI looks at environmental and economic forecasts and decides sustainability isn’t just something they should include on their website – though that would be a good start! – but a core guiding principle for their company. A chatbot can dream. (With the right prompts.)

Pathfinders Podcast

If you’d like to keep exploring the lunacy of tech with us, we invite you to listen and subscribe to the Pathfinders Podcast wherever you get your podcasts. The podcast is a meandering exploration inspired by the seeds planted in the Newmoonsletter at the beginning of the lunation cycle, and the paths illuminated during the Full Moon Gathering.

The question that emerged in the May Newmoonsletter and guided our discussion is: How do we nurture weird online communal gardens where we can play together? In this episode, we explore paths to online spaces where we can be weird and playful together so that we can co-create a more diverse and resilient tech future. We wonder whether the answer lies in switching from machine-scale attention marketplaces to (human) body-scale gardens that better support smaller groups, playing together, and embracing physical constraints and friction.

Take a listen and join us at the next Full Moon Gathering if you’d like to illuminate additional paths for our next episode!

Your turn, Pathfinders.

Join us for the Pathfinders Full Moon Gathering

In this lunation cycle, we’re inviting Pathfinders to gather around our virtual campfire to explore the question: How can we prompt biased AI assistants to help us think different and imagine diverse tech futures? – but it’s quite likely that our discussion will take other meandering turns as well.
So, pack your curiosity, moral imagination, and smiles, and join us around the virtual campfire for our next 🌕 Pathfinders Full Moon Gathering on Friday, June 21 at 5PM AEST / 9AM CEST, when the moon will once again be illuminated by the sun.

This is a free and casual open discussion, but please be sure to sign up so that we can lug an appropriate number of logs around the virtual campfire. And yes, friends who don’t have the attention span for the Newmoonsletter are also welcome, as long as they reserve their seat on the logs.

Keep on finding paths on your own

If you can’t make it to our Full Moon Pathfinding session, we still invite you to make your own! If anything emerges while reading this Newmoonsletter, write it down. You can keep these reflections for yourself or share them with others. If it feels right, find the Reply button – or comment on this post – and share your reflections with us. We’d love to feature Pathfinders’ reflections in upcoming Newmoonsletters and explore even more diverse perspectives.

And if you’ve enjoyed this Newmoonsletter or perhaps even cracked a smile, we’d appreciate it if you shared it with your friends and colleagues.

The next Newmoonsletter will rise again during the next new moon. Until then, remember to refill your water flasks, take as many fully offline walks as you can, and be mindful about the seeds of intention you plant and the stories you tell. There’s magic in both.

With 🙂 from the Tethix campfire,
Alja and Mat
