“Over 50% of web traffic is non-human,” declared David Bausola at NEXT Berlin last month. He went on to describe the world of bots crawling the net for information, to learn and to educate – and even mapping their travels onto the real world. His explanation of their Weavrs, web bots that travel the web based on a set of interests, location and emotional information you set them up with, was compelling – seductive, even. So seductive, in fact, that I found myself fiddling with the website rather than liveblogging his session.
It looked easy. I couldn’t resist. While the talk was still going on, I created my own Weavr – an approximation of myself – and set it roaming the virtual streets of Berlin. The material it found was fascinating. It was like having my own personal little link-blogger, scouring the web for information that might interest me, without me having to lift a finger. I watched its output via Twitter and RSS over the next few days, then pulled it home to the UK with me.
Bausola suggested that these Weavrs could be used for things like market research: set them up with a group of characteristics and see what happens. No human bias – just algorithmic results.
To me, though, they feel like the early stages of something I was promised way back in the early days of the internet – before the web, even. They are early intelligent agents, able to seek things out and pull together the information you want, even if you didn’t know you wanted it. I can’t wait to see what they become.