Pasteur’s Quadrant and the Next Wave in Materials; DARPA’s Quantum Quest; The Robot Artists Aren’t Coming
Louis Pasteur laid the foundations of microbiology more than a century ago. He did so by means of fundamental yet use-inspired studies: his work was driven by the quest to understand the fundamentals of microbiology while addressing practical needs. This approach led Donald Stokes, in his 1997 book, to define what is now known as Pasteur’s Quadrant, the place where the quest for fundamental understanding (i.e. basic research) is combined with considerations of use. This is the “sweet spot” where science and technology provide the most impact.
While until not long ago the players in the matrix drawn by Stokes were mostly universities, research institutes, and corporate research labs, today a key role is played by startups, which increasingly occupy Pasteur’s quadrant - like Commonwealth Fusion Systems, which I mentioned last week and which is working to make nuclear fusion possible and economically viable.
There are many reasons why startups occupy that quadrant today, something unthinkable a decade or so ago. One of them is the convergence of three forces that reinforce each other:
- We are moving more and more from understanding nature to engineering nature, changing our approach to it (gene reading and engineering, DNA synthesis, nanotechnology, materials science…)
- Computing power has moved to a different level through progress on AI, cloud and now quantum computing
- Robotics, drones, 3D printing, and AR/VR have provided a set of tools that have been key to advancing the engineering of nature by leveraging this new computing power
I have written already about the potential of deeptech and in particular of synthetic biology. In this week’s edition we can see, through two articles, how the pattern that has brought about incredible advances in synbio is now migrating to non-organic materials.
- A team at the University of Toronto has developed new catalysts to convert CO2 into ethylene, leveraging AI and an open database of 125K inorganic compounds. Similarly to what happened in synbio, the startup Kebotix is now working to bring process automation to materials development, the same way automation created incredible momentum for synbio by enabling high-throughput analysis.
- A Korean team, leveraging nanotechnology, has developed a technique for commercially printing metamaterials - "substances made from artificial atoms that do not exist in nature but [can] freely control the properties of light." Such materials could lead to AR/VR devices at 1/100th of the cost and 1/10,000th of the thickness.
These are just two of the many innovations happening on the materials front, where a wave similar (in terms of potential) to the one in synbio is in the making.
- On the computing front, while accurate and stable full quantum computers are still far out, DARPA is looking at hybrid systems (quantum and classical computing) as an intermediate step that might provide an additional push to both synbio and materials science.
And to remain in the computational sphere, two articles this week support a view of AI and the human-machine relationship that I deeply believe in:
- AI and robots are not going to become a source of art themselves, i.e. artists. Instead, they represent a new set of tools available for artists to operate with. They are their new brushes and chisels. See for instance the beautiful work produced by the artist Sougwen.
- In fact, I very much agree with Ben Shneiderman that we should move from “one-dimensional” machine automation to a two-dimensional alternative that allows for both high levels of machine automation and human control
And if you want to feel a bit nostalgic about (or discover) the HTML-only websites of the early web, you might want to look at Parimal Satyal’s latest blog post…
...along with a number of other enlightening reads.
Human and Machine
A Fight for the Soul of Machine Learning
From extreme bias in ML models to creepy surveillance AI startups, progress towards diversity and inclusion in tech has suffered setbacks in recent years. AI's "white guy" problem has been criticized by academics and the public. In 2015, Google Photos users discovered it was classifying images of black people as gorillas. Nikon's camera software also misread images of Asian people as blinking. Recent reporting about surveillance AI startups Banjo and Clearview exposed links to white supremacists. On top of these controversies, Google has reportedly scaled back its internal diversity and inclusion training programs to avoid backlash from conservatives.
VentureBeat highlights the bright and ugly sides of AI in diversity - covering recent reports and events that have raised questions about the industry's commitment to inclusion - and also more positive developments such as more diverse panels at AI conferences.
A Case for Cooperation Between Machines and Humans
The hype of fully autonomous cars has been floating around for a few years now, as has the skepticism. Critics cite safety and jobs as the top reasons why the world needs to rethink AVs. Though it has been shown otherwise, the popular belief is that automation will take jobs and drive down wages. Recent failures of autonomous control systems such as Boeing’s MCAS, Uber’s fatal crash, and the 2017 incident where a Tesla on Autopilot crashed into a stationary fire engine at 65 mph have given AV safety concerns even more relevance.
Ben Shneiderman, professor at the University of Maryland and human-computer interaction expert, believes that robots should be designed to work with humans rather than replace them. He argues that when humans can’t control systems, designers risk creating unsafe machines and absolving humans of ethical responsibility for their actions. Shneiderman challenges the engineering community to drop the current “one-dimensional” machine automation for a two-dimensional alternative that allows for both high levels of machine automation and human control.
Creativity
The Robot Artists Aren't Coming
AI can generate images of human faces that are nearly impossible to differentiate from real photos, OpenAI's MuseNet can create music at the click of a button, and there’s AI that generates entire articles and dialogues. Despite fears that AI will take artists' jobs (or commoditize creativity), Ahmed Elgammal, director at the Art and Artificial Intelligence Lab at Rutgers University, says AI is not a substitute for human creativity: "without a human artist behind the machine, A.I. can do little more than play with form" - a variation of the human as a cook/curator argument we'll see in the Wired piece in the Materials section.
Elgammal sees parallels between the integration of the camera and AI in art: "cameras didn't kill art; they simply provided people with another way to express themselves visually." Similarly, he notes, AI could assist art in different ways - such as distinguishing authentic pieces from forged ones and uncovering similar influences among artworks from different periods.
An homage to the creativity of the early HTML-only internet, a history of GeoCities-era personal expression, and a straight-up time capsule, Parimal Satyal's latest blog post explores the 90s small web - before sleek, SEO-optimized (and often ad-bloated) sites became the standard. Satyal's excitement and nostalgia are palpable. Satyal is even more excited about "restorative" projects like Wiby ("a search engine for old-school, interesting and informative webpages"), Neocities ("a modern web host that lets anyone create a basic website for free"), and Curlie ("the largest human-edited directory of the Web").
Life Sciences
Wearable Tech Can Spot Coronavirus Symptoms Before You Even Realize You're Sick
Since March, half a dozen (non-clinical) studies have explored whether wearable data could predict COVID-19 symptoms, with Fitbit joining the effort last week. Other devices used for seeking out patterns in user sensory data include Oura and Apple Watch (via a Scripps Research study). Washington Post dives into some of "the first evidence that the idea works."
"On Thursday, researchers at WVU's Rockefeller Neuroscience Institute reported that Oura ring data, combined with an app to measure cognition and other symptoms, can predict up to three days in advance when people will register a fever, coughing or shortness of breath."
"Researchers at Stanford University studying changes in heart rate from Fitbits tell me they've been able to detect the coronavirus before or at the time of diagnosis in 11 of 14 confirmed patients they've studied."
Spaces
How Smart City Planning Could Slow Future Pandemics
As the West African Ebola outbreak was subsiding, Bill Gates warned that the world was not ready for the next pandemic. COVID-19 has disproportionately affected low-income urban areas in the US, partly because "many municipalities weren't built with highly transmissible infectious disease—or human health—in front of mind." Wired explores experts' smart city strategies, which aim to ward off future outbreaks through means ranging from inclusivity to telehealth.
Institute of Urban and Regional Development director Jason Corburn says that unhealthy cities are an intricate problem that no short-term fix can solve: "We don't need a solution - we need to have a process that's much more open and inclusive and will center the people who have been marginalized." Urban planning professor Richard Matthew proposes reducing density - of manufacturing and of people - while city and regional planning professor James Spencer roots for cheap telemedicine stations throughout developing cities.
The State of the Self-Driving Car Race 2020
When DARPA first organized a race across the Mojave Desert between 15 self-driving cars in the 2004 Grand Challenge, there was hope that fully autonomous vehicles would eventually follow. Sixteen years later, only Waymo's driverless cars are taking passengers - and it's only in a few suburbs in Phoenix. Automakers such as Tesla and GM had promised to deploy self-driving cars by this year, but these dates have been pushed back. The slowdown can be attributed to technical challenges and licensing delays - for instance, about 21 states have not passed AV laws - among many other issues and setbacks.
Against the backdrop of these challenges, the pandemic is likely to uproot many players as companies cut back on non-essential expenses and smaller startups drop out - "A lot of these efforts may just cease to exist without so much as a Medium post to flag it. I anticipate we're going to see a pretty big winnowing," says Gartner VP Mike Ramsey. Bloomberg Hyperdrive lists the leaders (Waymo, Argo), posses (Baidu, Volvo), and rogues (Pony.AI, Tesla) of the industry.
Materials
This Lab 'Cooks' With AI to Make New Materials
Thomas Edison and his team tested over 3,000 materials between 1878 and 1880 to create the first light bulb - but they never tried tungsten, which turned out to be the reigning filament throughout the 1900s. Material discovery is a laborious task: "it can take two decades for scientists to discover a material and fine-tune it well enough for commercialization," notes Wired. But thanks to AI and supercomputers, the discovery process is slowly becoming faster and more comprehensive.
In 2017, for example, HRL Laboratories used AI to help invent a powder alloy for 3D-printing airplane parts; Massachusetts-based startup Kebotix is moving towards a "self-driving" laboratory - using computer simulations to suggest and test recipes for new materials; and a team at the University of Toronto used AI and supercomputing techniques to accelerate the conversion of CO2 to ethylene (a chemical used to make plastic).
Still, human input is indispensable: "even with stand mixers, instapots, and bread machines, the kitchen still needs a cook." Or a curator: Kristin Persson, a physicist at Lawrence Berkeley National Laboratory, says that algorithms only work up to a point, "because you still need to test all the ideas at the workbench."
Pandemic Will Jumpstart Automated Manufacturing – IFR
A study co-authored by MIT economist Daron Acemoglu found that one robot replaces 3.3 jobs on average, while another by Oxford Economics indicates that "tens of millions of existing jobs will be lost." Recent statistics, however, suggest that automation will create more jobs. In the five years leading to 2018, the auto industry was the biggest adopter of industrial robots – yet its employment increased by 22% from 824,400 to 1,005,000, according to the Bureau of Labor Statistics.
Roboticists and tech founders are, for the most part, coming to the same conclusion - International Federation of Robotics president Milton Guerry says automation will boost both productivity and jobs: "productivity increases and competitive advantages of automation don't replace jobs – they will automate tasks, augment jobs and create new ones."
Processors
DARPA's Quantum Quest May Leapfrog Modern Computers
Hybrid quantum and classical systems may one day efficiently solve problems like strategic asset deployment, global supply chains, battlefield logistics, and other combinatorial optimization problems too complex for current systems to solve. "Anyone who relies on getting something from point A to B or figuring out how to use limited resources is grappling with combinatorial optimization problems, whether they know it or not," says Creston Herold, a research scientist at the Georgia Tech Research Institute. Herold is part of a team working for DARPA's Optimization with Noisy Intermediate-Scale Quantum program (ONISQ), which aims to utilize quantum computing before operable quantum systems are available.
"ONISQ seeks to demonstrate the quantitative advantage of quantum information processing by leapfrogging the performance of classical-only systems in solving optimization challenges," according to DARPA's ONISQ website. Perfectly stable and accurate quantum processors may be decades away, but successful hybrid systems would be a major breakthrough.
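To see why these "limited resources" problems overwhelm classical machines, here is a rough illustration (my own sketch, not from the article): a brute-force solver for the classic 0/1 knapsack problem, a textbook combinatorial optimization task. With n items there are 2^n subsets to check, and that exponential blow-up is exactly the scaling hybrid quantum-classical approaches hope to beat.

```python
from itertools import product

def knapsack_brute_force(weights, values, capacity):
    """Try all 2^n item subsets; keep the most valuable one that fits."""
    n = len(weights)
    best_value, best_subset = 0, ()
    for choice in product([0, 1], repeat=n):  # 2^n candidate subsets
        total_weight = sum(w for w, c in zip(weights, choice) if c)
        if total_weight <= capacity:
            total_value = sum(v for v, c in zip(values, choice) if c)
            if total_value > best_value:
                best_value, best_subset = total_value, choice
    return best_value, best_subset

# A tiny "use limited resources" instance: pack the most valuable cargo.
value, picks = knapsack_brute_force(
    weights=[3, 4, 5, 2], values=[4, 5, 6, 3], capacity=7)
print(value, picks)  # → 9 (0, 0, 1, 1)
```

At 4 items this loop checks 16 subsets; at 60 items it would need over 10^18, which is why practical solvers rely on heuristics - and why DARPA is betting noisy quantum hardware might help.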
Interfaces
Facebook Teases a Vision of Remote Work Using Virtual and Augmented Reality
Andrew Bosworth, Facebook's head of AR and VR, recently posted an eight-second clip on Twitter showcasing real footage of a prototype Oculus headset going through an impressive remote-work POC. The POC features fully-virtual work displays imposed onto the real world, a "mix of AR and VR - what the tech industry calls mixed reality - that uses passthrough to show you your keyboard while you type." In a recent blog post, the company elaborated on its vision of XR and remote work:
"In the future, we could create a super-powered augmented workspace with multiple customizable screens in VR, unbounded from the limits of physical monitors. It would leverage technologies like Passthrough to create a mixed reality productivity experience that allows people to switch between real and virtual worlds at any time, improving spatial awareness while offering the flexibility we’re accustomed to with laptops and other common devices."
VR and AR Devices at 1/100 the Cost and 1/10,000 the Thickness in the Works
Researchers at Pohang University of Science and Technology and Korea University collaborated on a technique for commercially printing metamaterials - "substances made from artificial atoms that do not exist in nature but [can] freely control the properties of light." Using the technique, the team created an ultrathin metalens 100x faster than via conventional techniques (for more on metalenses, see the video). The metalens is 100x thinner than a human hair, and one-thousandth of the thickness of heavy glass or plastic lenses.
According to lead researcher Junsuk Rho, "these lenses can not only make the existing thick, large VR and AR lenses or glasses dramatically lighter and smaller, but can also be applied to curved or flexible panels, which facilitates the use of metamaterials in large omnidirectional invisible cloaks or in curved or bendable wearable devices at a fraction of the cost."
"We are moving more and more from understanding nature to engineering nature and change our approach to it." This transformation can be accelerated by inquiry into the ways we have been engineering nature from the dawn of agriculture, livestock management, and the built environment, most notably in the effects of human activity on climate change and ecological pollution. In this sense, the transformation can be characterized more accurately as change from engineering nature badly to doing it well.