Wednesday, December 7, 2022

GV wants to help build a disease-sniffing company, literally

Shreya Christina, londonbusinessblog.com

Alex Wiltschko has what he thinks is a great idea. He wants to build a company that digitizes scent.

It’s a natural step for Wiltschko, who has a PhD in neurobiology from Harvard, where he studied how the brain processes smell. He didn’t end up in this particular field by accident, he suggests. It is thanks to a lifelong obsession with smell that he came to study alongside Sandeep “Bob” Datta, a Harvard professor who has long been concerned with what happens after a person’s sensory neurons pick up a scent.

The researchers sought to better understand how the human brain works, including why certain smells are linked to memories. For a long time, their field of study was dwarfed by the attention that vision and image processing have received over the years. Then came Covid-19, and with it much more focus on how taste and smell are processed – and lost.

Now the race has begun to better understand, digitize and even recreate scent. Indeed, in July, a neurotech startup called Canaery raised $4 million in seed funding to develop an odor detection platform. Moodify, another startup working on scent digitization, closed an $8 million round of funding last year, with backing that included Procter & Gamble.

As Datta told Harvard Magazine late last year: “At the moment there is a lot of intense interest in smell from doctors and from the many millions of patients whose sense of smell is impaired. And it’s really highlighted how little we know about all aspects of our sense of smell.”

For the time being, Wiltschko belongs to the select group that sees opportunity in solving these unknowns. His old employer, Google, sees it too. After nearly six years at Google AI following his PhD, Wiltschko has just become an entrepreneur-in-residence (EIR) at GV, the venture arm of the search giant, where he hopes to build a business that can identify diseases more quickly based on specific odor molecules.

It’s a meaningful vote of confidence from GV, which has appointed only five life sciences-focused EIRs in its 12-year history, including the founders of Flatiron Health (which sold to pharmaceutical giant Roche in 2018 for $1.9 billion); the gene editing company Verve Therapeutics, which went public last year; and Rome Therapeutics, a startup developing therapies for cancer and autoimmune diseases by focusing on parts of DNA that have been largely overlooked by researchers, it says. (Rome has raised at least $127 million over two funding rounds to date.)

The big question, of course, is what will come of the effort. To find out more about how he approaches his mission, we spoke to Wiltschko yesterday, who was kind but also reluctant to share too much. Our conversation has been lightly edited for length and clarity.

TC: I’ve never come across anything like this before. Are you trying to better understand how to build neural networks based on how people process and compartmentalize information about smells?

AW: Taking a step back, every time computers got a new “sense,” like seeing or hearing, it completely changed society for the better, didn’t it? When we first learned how to store visual images in the 19th century, and eventually on computers in the 20th century, we were suddenly able to do things like take X-rays. We could do things like save memories [of] the visual world. And we didn’t need painters to do it; anyone could do it. We did it again with hearing; we [could make] music [captured in one location] available to the masses.

But computers can’t smell. They have no ability to detect the chemical world [so] we can’t store the really powerful memories we associate with smell, like the smell of my grandmother’s house. It’s just gone. It only lives in my mind. The smells of people I love, and of places I’ve been, are completely fleeting today.

We [also] know that diseases have a smell. We know that different states of well-being and health have a smell. Plants, when they are sick or healthy, smell different. There is an enormous amount of information out there in the world that we could act on to extend our lives, make our lives more joyful, grow more food – information that today can really only be read by living noses in living beings. If we could automate that, we could have a huge and positive impact on society.

What applications do you have in mind?

I think the North Star for me – and I don’t know how long it’s going to take to get there – is that we’re going to be able to detect disease by smell earlier than we currently can. There are many stories out there – many anecdotes and various articles – and the research has built up a kind of picture for me that we can smell Parkinson’s much earlier than we could otherwise detect it; we can smell disease much, much sooner. And if we could build devices that can convert that information into digital representations, we could potentially catch diseases earlier and learn how to better treat them.

How do we know that we can detect Parkinson’s earlier than otherwise through smell?

There is no single slam dunk that can help us detect disease earlier, but there are many stories, each with their strengths and weaknesses, that together provide a clearer picture. For Parkinson’s, there’s a nurse who first reported that she could smell Parkinson’s on her husband before he actually developed it, and they put it to the test. They collected T-shirts that men had worn – half with Parkinson’s, half without – and asked, ‘Hey, can you tell which of these T-shirts were worn by a person with this disease?’ She got all of them right except one, and she said [to the researchers], ‘Actually, I think you are wrong.’ And that person eventually got Parkinson’s disease.

They followed up on the story, trying to figure out exactly what she smelled. And researchers found the exact material expelled from the body: this waxy substance called sebum that’s secreted by cells on your back. And they found the exact molecules she smelled. But it was her nose, her ability to take an olfactory picture of the world and turn it into an idea of whether someone was sick or not, that preceded all that.

If we can eventually digitize scent as you suggest, are you concerned that scents could be manipulated for certain purposes – perhaps to trick people into thinking they’re in danger when they aren’t, or safe when they’re in danger? There is a lot of good that comes from new technologies, but also second-order effects that we don’t always think about.

It’s definitely important when a new technology area is being developed to think about those things, sure. It’s an area that I think is in its infancy; it’s not clear at all where it might go. But, at a minimum, I’m personally relaxed by certain scents, and I don’t know why. So I think there’s a lot for us to learn in that space.

Have you studied the effects of COVID-19 on the sense of smell?

I personally? No. But my former mentors certainly looked at this very, very closely. We started a lot of this research on how people think things smell when COVID just started. And we had to be very careful because people would lose their sense of smell if they got COVID. And when you study how people think things smell, you have to be very, very careful when people suddenly become anosmic — that’s the term for the loss of the sense of smell. And so we had to develop all kinds of new checks and balances in our research protocols.

And now you have joined GV with the idea of ​​developing a business. What types of resources are available to you? Are you going to team up with some of your former Harvard Medical School colleagues? I assume you need access to many datasets.

What’s great about starting to work on this idea today, compared to maybe 10 years ago, is that the ecosystem of people working on smell has grown dramatically. And I think the attention paid to our sense of smell – because we now understand how important it is when we lose it – [has fostered a] much richer ecosystem of people working and thinking about the sense of smell.

Are there already companies that are trying to do exactly what you hope to do?

It’s a vibrant ecosystem and there are a lot of people working on different parts of it. What’s really great about joining GV as an entrepreneur-in-residence is that I can take the broad view and think about how to make the most impact in the space of digital olfaction.

Can you tell us more about that path? You mentioned Parkinson’s. Is the idea to first focus on being able to diagnose Parkinson’s and build around that, or do you opt for a multi-pronged approach?

Going back to the other senses, there are 1,000 things you could do if you could take the visual world, or all the sound in it, and store it in a computer and analyze it. Those are the two sides of the sword – there are so many opportunities, so many places to start, [but] on the other hand, you have to concentrate. So I spend a lot of my time thinking about the right way to move specifically toward our North Star, which is improving the well-being and length of human life.

People offer wildly different timelines when it comes to when we might see artificial general intelligence. Some say it will be in 5 years. Some say 10 years. Some say 500. What’s your best guess when it comes to how far we are from digitizing the sense of smell?

It may have taken 100 years to digitize our eyesight. I think we can compress digitizing our sense of smell to a fraction of that. It won’t be easy. It will take a lot of work. But now is a good time to start.
