Category: What I’m Reading

  • CAR-T Bests Solid Tumors

    The promise of immune cell therapies is to direct the incredible sophistication of the immune system to the medical targets we want to hit. In the case of CAR-T cells, we take a patient’s own immune T-cells and “program” them through genetic modification to go after cancer cells. While the genetic-modification process used to create those cells is still incredibly expensive and challenging, we’ve seen amazing progress with CAR-T in liquid tumors (e.g., leukemias and lymphomas).

    But, when it comes to solid tumors, it’s been far more challenging. Enter this Phase II clinical trial from China (summarized in Nature News). The researchers ran a randomized controlled trial on 266 patients with gastric or gastro-esophageal cancer that had resisted previous treatment, assigning two-thirds to receive CAR-T and the rest to best medical care (the control). The results (see the survival curve below) are impressive — while the difference in median progression-free survival is only about 1.5 months, it’s very clear that by month 8 there are no progression-free patients in the control group, versus something like ~25% of the CAR-T group.

    The side-effect profile is still challenging (99% of patients in the CAR-T group experienced moderately severe side effects), but this is (sadly) to be expected with CAR-T treatments.

    While it remains to be seen how this scales up in a Phase III study with a larger population, this is an incredibly promising finding — giving clinicians a new tool in their arsenal for dealing with a wider range of cancer targets, and suggesting that cell therapies still have more tricks up their sleeves.


  • Great Expectations

    This is an old piece from Morgan Housel from May 2023. It highlights how optimistic expectations can serve as a “debt” that needs to be “paid off”.

    To illustrate this, he gives a fascinating example — the Japanese stock market. From 1965 to 2022, the Japanese stock market and the S&P 500 (a basket of mostly American large companies) had similar returns. As most people know, Japan has had three miserable “lost decades” of growth and stock performance. But Housel presents this fact in an interesting light: it wasn’t that Japan did poorly, it’s that it did all of its growth in a 25-year run between 1965 and 1990 and then spent the following three decades “paying off” that “expectations debt”.

    Housel concludes, as he oftentimes does, with wisdom for all of us: “An asset you don’t deserve can quickly become a liability … reality eventually catches up, and demands repayment in equal proportion to your delusions – plus interest”.

    Manage your great expectations.


    Expectations Debt
    Morgan Housel

  • Why Self-Hosting is Having a Moment

    I am a self-hoster (as anyone who’s read my more recent blog posts knows). I’m also a fan of the selfh.st site (which documents a lot of news & relevant interviews from the self-hosting world), so I was delighted to see the owner of selfh.st interviewed in Ars Technica.

    Nothing earth-shattering, but I appreciated (and agreed with) his breakdown of why self-hosting is flourishing today (excerpt below). For me personally, the ease with which Docker lets you set up self-hosted services, along with the low cost of storage and mini-PCs, turned this from an impractical idea into one that I’ve come to rely on for my own “personal tech stack”.


  • I’m just a Medical Guideline

    Medical guidelines are incredibly important — they impact everything from your doctor’s recommendations to the medications your insurance covers — but they’re somewhat shrouded in mystery.

    This piece from Emily Oster’s ParentData is a good overview of what they are (and aren’t) — and gives a pretty good explanation of why a headline from the popular press is probably not capturing the nuance and review of clinical evidence that goes into them.

    (and yes, that title is a Schoolhouse Rock reference)


  • On (Stock Market) Bubbles

    I spotted this memo from Oaktree Capital founder Howard Marks and thought it was a sobering, grounded take on what makes a stock market bubble, why the current concentration of market capitalization in the so-called “Magnificent Seven” is cause for alarm, and how eerily it resembles the “Nifty Fifty” and dot-com-bubble eras of irrational exuberance. Whether you agree with him or not, it’s a worthwhile piece of wisdom to remember.

    This graph that Marks borrowed from JP Morgan is also quite intriguing (terrifying?).


    On Bubble Watch
    Howard Marks

  • Pivoting from Consumer to Utility: Span

    As a Span customer, I’ve always appreciated their vision: to make home electrification cleaner, simpler, and more efficient through beautifully designed, tech-enabled electrical panels. But, let’s be honest, selling a product like this directly to consumers is tough. Electrical panels are not top-of-mind for most people until there’s a problem — and explaining the value proposition of “a smarter electrical panel” to justify the high price tag can be a real challenge. That’s why I’m unsurprised by their recent shift in strategy towards utilities.

    This pivot to partnering with utility companies makes a lot of sense. Instead of trying to convince individual homeowners to upgrade, Span can now work directly with those who can impact community-scale electrification.

    While the value proposition of avoiding costly service upgrades is undeniably beneficial for utilities, precisely how that translates into financial savings for them is much more nuanced. That, along with the fact that rebates & policy will vary wildly by locality, raises many uncertainties about pricing strategy (not to mention that there are other, larger smart electrical panel companies like Leviton and Schneider Electric, albeit with less functional and less well-designed offerings).

    I wish the company well. We need better electrical infrastructure in the US (and especially California, where I live) and one way to achieve that is for companies like Span to find a successful path to market.


    Span’s quiet turn toward utilities
    Lisa Martine Jenkins | Latitude Media

  • RISC-V in Computers

    One of the most exciting technological developments from the semiconductor side of things is the rapid development of the ecosystem around the open-source RISC-V instruction set architecture (ISA). One landmark in its rise is that the architecture appears to be moving beyond just behind-the-scenes projects to challenging Intel/AMD’s x86 architecture and ARM (used by Apple and Qualcomm) in customer-facing applications.

    This article highlights this crucial development by reporting on early adopters embracing RISC-V in higher-end devices like laptops. Companies like Framework and DeepComputing have just launched or are planning to launch RISC-V laptops. RISC-V-powered hardware still has a steep mountain of software and performance challenges to climb (as evidenced by how long it took the ARM ecosystem to become credible in PCs). But Intel’s recent setbacks, ARM’s legal battles with Qualcomm over licensing (which pretty much guarantee that every company using ARM will now work on RISC-V), and the open-source nature of RISC-V — which potentially allows a lot more innovation in form factors and functionality — may have created an opening for enterprising companies willing to make the investment.


    This Year, RISC-V Laptops Really Arrive
    Matthew S. Smith | IEEE Spectrum

  • Revenge of the Plug-In Hybrid

    While growing vehicle electrification is inevitable, it always surprised me that US automakers would drop past plug-in hybrid (PHEV) technology to embrace all-electric only. Many have attacked Toyota’s more deliberate “slow-and-steady” approach to vehicle electrification, but it always seemed to me that, until we had broadly available, high-quality charging infrastructure and until all-electric vehicles were broadly available at the price point of a non-luxury family car (i.e., a Camry or RAV4), electric vehicles were going to be more of an upper-middle-class/wealthy phenomenon. Considering PHEVs’ success in the Chinese automotive market (where they are growing faster than all-electric vehicles!), it always felt odd that the category wouldn’t make its way into the US market as the natural next step in vehicle electrification.

    It sounds like Ram (a division of Stellantis) agrees. It intends to delay the all-electric version of its Ram 1500 in favor of starting with its extended-range plug-in hybrid version, the Ramcharger. Extended-range electric vehicles (EREVs) are plug-in hybrids similar to the Chevy Volt: they employ an electric powertrain plus a gasoline-powered generator that supplies additional range when the battery runs low.

    Interestingly, Nissan, Hyundai, Mazda, and GM’s Buick have made similar announcements as well.

    While it remains to be seen how well these EREVs/PHEVs are adopted — the price points being discussed still feel too high to me — broader adoption of plug-in hybrid technology (supplemented with gas-powered range extension) feels like the natural next step on our path to vehicle electrification.


    As EV Sales Stall, Plug-In Hybrids Get a Reboot
    Lawrence Ulrich | IEEE Spectrum

  • Decarbonizing Shipping with Wind

    The shipping industry is known for being fairly dirty environmentally, largely because the most common fuel used in shipping — bunker fuel — contributes to carbon emissions, significant air pollution, and water pollution (from spills and from the common practice of dumping the byproduct of the sulphur scrubbing used to curtail air pollution).

    While much of the effort to green shipping has focused on alternative fuels like hydrogen, ammonia, and methanol as replacements for bunker fuel, I recently saw an article on the use of automated & highly durable sail technology to let ships leverage wind as a means to reduce fuel consumption.

    I don’t have any inside information on what the cost / speed tradeoffs are for the technology, nor whether or not there’s a credible path to scaling to handle the massive container ships that dominate global shipping, but it’s a fascinating technology vector, and a direct result of the growing realization by the shipping industry that it needs to green itself.


  • Google’s Quantum Error Correction Breakthrough

    One of the most exciting areas of technology development — though one that doesn’t get a ton of mainstream media coverage — is the race to build a working quantum computer that operates “below threshold”: performing calculations accurately enough that quantum error correction actually improves, rather than degrades, as the system scales.

    One of the key limitations to achieving this has been the sensitivity of quantum computing systems — in particular the qubits that capture the superposition of multiple states that allow quantum computers to exploit quantum mechanics for computation — to the world around them. Imagine if your computer’s accuracy would change every time someone walked in the room — even if it was capable of amazing things, it would not be especially practical. As a result, much research to date has been around novel ways of creating physical systems that can protect these quantum states.

    Google has (in a paper in Nature) unveiled their new Willow quantum computing chip, which demonstrates a quantum error correction method that spreads the quantum state information of a single “logical” qubit across multiple entangled “physical” qubits to create a more robust system. Beyond proving that their quantum error correction method works, what is most remarkable to me is that they’re able to extrapolate a scaling law for their error correction — a way of estimating how much better their system gets at avoiding loss of quantum state as they increase the number of physical qubits per logical qubit — which could suggest a “scale up” path towards building functional, practical quantum computers.
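    To make the scaling-law idea concrete, here is a minimal sketch. The starting error rate and the suppression factor Λ below are illustrative assumptions, not numbers from Google’s paper: the key idea is that in a below-threshold code, each increase of the code distance by 2 divides the logical error rate by a roughly constant factor Λ, while the physical-qubit count grows only quadratically.

```python
# Illustrative sketch of below-threshold error suppression in a surface
# code. All numbers (eps_0, Lambda) are assumptions for illustration.

def logical_error_rate(d, eps_0=3e-3, d0=3, Lambda=2.0):
    """Logical error per cycle at code distance d, assuming each
    distance increase of 2 suppresses errors by a factor Lambda."""
    return eps_0 / Lambda ** ((d - d0) / 2)

def physical_qubits(d):
    """A distance-d surface code uses roughly d*d data qubits plus
    d*d - 1 measurement qubits, i.e. 2*d*d - 1 in total."""
    return 2 * d * d - 1

for d in (3, 5, 7, 9, 11):
    print(f"d={d:2d}  qubits≈{physical_qubits(d):4d}  "
          f"logical error/cycle≈{logical_error_rate(d):.2e}")
```

    The punchline of such a scaling law is that error rates fall exponentially while hardware costs grow only polynomially — which is exactly the trade you want for a “scale up” path.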

    I will confess that quantum mechanics was never my strong suit (beyond needing it for a class on statistical mechanics eons ago in college), and my understanding of the core physics underlying what they’ve done in the paper is limited, but this is an incredibly exciting feat on our way towards practical quantum computing systems!


  • Cynefin

    I had never heard of this framework for thinking about how to address problems before. Shout-out to my friend Chris Yiu and his new Substack Secret Weapon (about improving productivity) for teaching me about it. It’s surprisingly insightful about when to treat something as a process problem vs. an expertise problem vs. a matter for experimentation vs. one requiring direction.


    Problems come in many forms
    Chris Yiu | Secret Weapon

  • The Hits Business — Games Edition

    In entertainment, games offer the best return on investment in terms of hours of deep engagement per dollar. When done right, they blend stunning visuals and sound, earworm-like musical scores, compelling story and acting, and a sense of progression — a combination that is second to none.

    Case in point: I bought the complete edition of the award-winning The Witcher 3: Wild Hunt for $10 during a Steam sale in 2021. According to Steam, I’ve logged over 200 hours (I had to double-check that number!) playing the game, between two playthroughs and the amazing expansions Hearts of Stone and Blood and Wine — an amazing 20 hours per dollar spent. Even paying full freight (as of this writing, the complete edition including both expansions costs $50), that would still be a remarkable 4 hours per dollar. Compare that with the price of admission to a movie, the theater, or a concert.

    The Witcher 3 has now surpassed 50 million copies sold — comfortably earning over $1 billion in revenue, an amazing feat for any media property.

    But as amazing and as lucrative as these games can be, they cannot escape the cruel hit-driven economics of their industry, where a small number of games generate the majority of financial returns. This has studios chasing ever more expensive games built on familiar intellectual property (e.g., Star Wars) — a trend that, to many players, has cut the soul out of the games and has led to financial instability at even popular studios.

    This article from IGN summarizes the state of the industry well — with so-called AAA games now costing $200 million to create, not to mention hundreds of millions more to market, more and more studios are winding down, as few games can generate enough revenue to cover the cost of development and marketing.

    The article predicts — and I hope it’s right — that the games industry will learn a lesson that many studios in Hollywood have been forced to: embrace more small-budget games to experiment with new forms and IP. Blockbusters will have their place, but going all-in on blockbusters is a recipe for hollowing out the industry and cutting off the creativity it needs.

    Or, as the author so nicely puts it: “Maybe studios can remember that we used to play video games because they were fun – not because of their bigger-than-last-year maps carpeted by denser, higher-resolution grass that you walk across to finish another piece of side content that pushes you one digit closer to 100% completion.”


  • The Challenge of Capacity

    The rise of Asia as a force to be reckoned with in large scale manufacturing of critical components like batteries, solar panels, pharmaceuticals, chemicals, and semiconductors has left US and European governments seeking to catch up with a bit of a dilemma.

    These activities largely moved to Asia because financially-motivated management teams in the West (correctly) recognized that:

    • they were low-return in a conventional financial sense (requiring tremendous investment and maintenance)
    • most of these had a heavy labor component (and higher wages in the US/Europe put US/European firms at a cost disadvantage)
    • these activities tend to benefit from economies of scale and regional industrial ecosystems, so it makes sense for an industry to have fewer and larger suppliers
    • much of the value was concentrated in design and customer relationship, activities the Western companies would retain

    What the companies failed to take into account was the speed at which Asian companies like WuXi, TSMC, Samsung, LG, CATL, Trina, Tongwei, and many others would consolidate (usually with government support), ultimately “graduating” into dominant positions with real market leverage and with the profitability to invest into the higher value activities that were previously the sole domain of Western industry.

    Now, scrambling to reposition themselves closer to the forefront in some of these critical industries, these governments have tried to kickstart domestic efforts, only to face the economic realities that led to the outsourcing to begin with.

    Northvolt, a major European effort to produce advanced batteries in Europe, is one example of this. Despite raising tremendous private capital and securing European government support, the company filed for bankruptcy a few days ago.

    While much hand-wringing is happening in climate-tech circles, I take a different view: this should really not come as a surprise. Battery manufacturing (like semiconductor, solar, pharmaceutical, etc) requires huge amounts of capital and painstaking trial-and-error to perfect operations, just to produce products that are steadily dropping in price over the long-term. It’s fundamentally a difficult and not-very-rewarding endeavor. And it’s for that reason that the West “gave up” on these years ago.

    But if US and European industrial policy is to be taken seriously here, the respective governments need to internalize that reality and commit for the long haul. The idea that what these Asian companies are doing is “easily replicated” is simply not true, and the question is not if but when the next recipient of government support will fall into dire straits.


  • A Visual Timeline of Human Migration

    A beautiful map laying out when humans settled different parts of the world (from 2013, National Geographic’s Out of Eden project).


    A Walk Through Time
    Jeff Blossom | National Geographic

  • Not your grandma’s geothermal energy

    The pursuit of carbon-free energy has largely leaned on intermittent sources — like wind and solar — and on sources that require a great deal of initial investment — like hydroelectric (which requires elevated bodies of water and dams) and nuclear (which requires you to set up a reactor).

    The theoretical beauty of geothermal power is that, if you dig deep enough, virtually everywhere on planet earth is hot enough to melt rock (thanks to the nuclear reactions that heat up the inside of the earth). But, until recently, geothermal has been limited to regions of Earth where well-formed geologic formations can deliver predictable steam without excessive engineering.

    But, ironically, it is the fracking boom — which has helped the oil & gas industries access new sources of carbon-producing energy — that may help us tap geothermal power in more places. Fracking and oil & gas exploration have driven a revolution in our ability to precisely drill deep underground and push & pull fluids, giving us the ability to tap more geothermal power than ever before. This has led to the rise of enhanced geothermal: injecting water deep underground to be heated, then leveraging the steam produced to generate electricity. Studies suggest the resource is particularly rich and accessible in the Southwest of the United States (see map below) and could be an extra tool in our portfolio for greening energy consumption.

    (Source: Figure 5 from NREL study on enhanced geothermal from Jan 2023)

    While there is a great deal of uncertainty around how much this will cost and just what it will take (not to mention the seismic risks that have plagued some fracking efforts), the hunger for more data center capacity and the desire to power this with clean electricity has helped startups like Fervo Energy and Sage Geosystems fund projects to explore.


  • Who needs humans? Lab of AIs designs valid COVID-binding proteins

    A recent preprint from Stanford has demonstrated something remarkable: AI agents working together as a team solving a complex scientific challenge.

    While much of the AI discourse focuses on how individual large language models (LLMs) compare to humans, much of human work today is a team effort, and the right question is less “can this LLM do better than a single human on a task” and more “what is the best team-up of AI and human to achieve a goal?” What is fascinating about this paper is that it looks at it from the perspective of “what can a team of AI agents achieve?”

    The researchers tackled an ambitious goal: designing improved COVID-binding proteins for potential diagnostic or therapeutic use. Rather than relying on a single AI model to handle everything, the researchers tasked an AI “Principal Investigator” with assembling a virtual research team of AI agents! After some internal deliberation, the AI Principal Investigator selected an AI immunologist, an AI machine learning specialist, and an AI computational biologist. The researchers made sure to add an additional role, one of a “scientific critic” to help ground and challenge the virtual lab team’s thinking.

    The team composition and phases of work planned and carried out by the AI principal investigator
    (Source: Figure 2 from Swanson et al.)

    What makes this approach fascinating is how it mirrors high functioning human organizational structures. The AI team conducted meetings with defined agendas and speaking orders, with a “devil’s advocate” to ensure the ideas were grounded and rigorous.

    Example of a virtual lab meeting between the AI agents; note the roles of the Principal Investigator (to set agenda) and Scientific Critic (to challenge the team to ground their work)
    (Source: Figure 6 from Swanson et al.)

    One tactic the researchers said helped boost creativity — and one that is harder to replicate with humans — was running parallel discussions, in which the AI agents had the same conversation over and over again. In these discussions, the human researchers set the “temperature” of the LLM higher (inviting more variation in output). The AI principal investigator then took the output of all of these conversations and synthesized it into a final answer (this time with the LLM temperature set lower, to reduce the variability and “imaginativeness” of the answer).

    The use of parallel meetings to get “creativity” and a diverse set of options
    (Source: Supplemental Figure 1 from Swanson et al.)
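    The parallel-discussion pattern itself is simple to sketch in code. In this illustration, `chat` is a hypothetical stand-in for a real LLM API call (the paper does not prescribe any particular API); only the temperature knob matters for the pattern — many high-temperature runs for diversity, then one low-temperature synthesis.

```python
import random

# Sketch of the parallel-meetings pattern. `chat` is a placeholder for a
# real LLM call; here it fakes variation by sampling from canned ideas,
# more freely at high temperature, deterministically at low temperature.

IDEAS = ["mutate CDR loops", "graft frameworks",
         "ensemble scoring", "stability filters"]

def chat(prompt: str, temperature: float) -> str:
    # High temperature: sample widely. Low temperature: pick the
    # highest-ranked option (a stand-in for near-greedy decoding).
    k = len(IDEAS) if temperature >= 0.8 else 1
    return random.choice(IDEAS[:k])

def parallel_meetings(prompt: str, n: int = 5) -> str:
    # 1. Run the same discussion n times at high temperature.
    drafts = [chat(prompt, temperature=1.0) for _ in range(n)]
    # 2. The "principal investigator" merges the drafts at low
    #    temperature to produce a single grounded answer.
    synthesis_prompt = "Synthesize these proposals: " + "; ".join(drafts)
    return chat(synthesis_prompt, temperature=0.1)

print(parallel_meetings("How should we improve SARS-CoV-2 nanobodies?"))
```

    The design mirrors a classic explore-then-exploit split: diversity is cheap to generate in parallel, and the expensive judgment call happens once, at the synthesis step.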

    The results? The AI team successfully designed nanobodies (small antibody-like proteins — this was a choice the team made to pursue nanobodies over more traditional antibodies) that showed improved binding to recent SARS-CoV-2 variants compared to existing versions. While humans provided some guidance, particularly around defining coding tasks, the AI agents handled the bulk of the scientific discussion and iteration.

    Experimental validation of some of the designed nanobodies; the relevant comparison is the filled-in circles vs. the open circles. The higher ELISA assay intensity for the filled-in circles shows that the designed nanobodies bind better than their un-mutated original counterparts
    (Source: Figure 5C from Swanson et al.)

    This work hints at a future where AI teams become powerful tools for human researchers and organizations. Instead of asking “Will AI replace humans?”, we should be asking “How can humans best orchestrate teams of specialized AI agents to solve complex problems?”

    The implications extend far beyond scientific research. As businesses grapple with implementing AI, this study suggests that success might lie not in deploying a single, all-powerful AI system, but in thoughtfully combining specialized AI agents with human oversight. It’s a reminder that in both human and artificial intelligence, teamwork often trumps individual brilliance.

    I personally am also interested in how different team compositions and working practices might lead to better or worse outcomes — for both AI teams and human teams. Should we have one scientific critic, or should there be specialist critics for each task? How important was the speaking order? What if the group came up with its own agendas? What if there were two principal investigators with different strengths?

    The next frontier in AI might not be building bigger models, but building better teams.

  • A Digital Twin of the Whole World in the Cloud

    As a kid, I remember playing Microsoft Flight Simulator 5.0 — while I can’t say I really understood all the nuances of the several hundred page manual (which explained how ailerons and rudders and elevators worked), I remember being blown away with the idea that I could fly anywhere on the planet and see something reasonably representative there.

    Flash forward a few decades, and Microsoft Flight Simulator 2024 can safely be said to be one of the most detailed “digital twins” of the whole planet ever built. In addition to detailed photographic mapping of many locations (I would imagine a combination of aerial surveys and satellite imagery) and an accurate real-world inventory of every helipad (including offshore oil rigs!) and glider airport, it also simulates flocks of animals, plane wear and tear, how snow vs. mud vs. grass behave when you land on them, wake turbulence, and more! And, just as impressive, it’s all streamed from the cloud to your PC/console as you play!

    Who said the metaverse is dead?


  • Making a Movie to Make Better Video Encoding

    Until I read this Verge article, I had assumed that video codecs were a boring affair. In my mind, every few years, the industry would get together and come up with a new standard that promised better compression and better quality for the prevailing formats and screen types and, after some patent licensing back and forth, the industry would standardize around yet another MPEG standard that everyone uses. Rinse and repeat.

    The article was an eye-opening look at how video streamers like Netflix are pushing the envelope with video codecs. Since one of a video streamer’s core costs is video bandwidth, it makes sense that they would embrace new compression approaches (like different kinds of compression for different content) to reduce those costs. And as Netflix embraces more live-streamed content, it seems they’ll need to create new methods to accommodate that as well.

    But what jumped out at me the most was that, in order to better test and develop the next generation of codecs, they produced a real 12-minute noir film called Meridian (you can access it on Netflix; below is a copy someone uploaded to YouTube) that presents scenes that have historically been difficult to encode with conventional video codecs (extreme lights and shadows, cigar smoke and water, rapidly changing light balance, etc.).

    Absolutely wild.


  • Games versus Points

    The Dartmouth College Class of 2024, for their graduation, got a very special commencement address from tennis legend Roger Federer.

    There is a wealth of good advice in it, but the most interesting point that jumped out at me is that while Federer won a whopping 80% of the matches he played in his career, he won only 54% of the points. It underscores the importance of letting go of small failures (“When you lose every second point, on average, you learn not to dwell on every shot”) but also of keeping your eye on the right metric (games, not points).
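    As a rough illustration of how a small per-point edge compounds through tennis’s nested scoring, here is a quick Monte Carlo sketch. It assumes (unrealistically) that every point is won independently with 54% probability — real matches have serve/return dynamics — so treat it as an illustration of compounding, not a model of Federer’s career.

```python
import random

P_POINT = 0.54  # assumed probability of winning any single point

def win_game(rng):
    # A game: first to 4 points, win by 2.
    a = b = 0
    while True:
        if rng.random() < P_POINT: a += 1
        else: b += 1
        if a >= 4 and a - b >= 2: return True
        if b >= 4 and b - a >= 2: return False

def win_set(rng):
    # A set: first to 6 games, win by 2; tiebreak (first to 7,
    # win by 2) at 6-6.
    a = b = 0
    while True:
        if win_game(rng): a += 1
        else: b += 1
        if a >= 6 and a - b >= 2: return True
        if b >= 6 and b - a >= 2: return False
        if a == 6 and b == 6:
            ta = tb = 0
            while True:
                if rng.random() < P_POINT: ta += 1
                else: tb += 1
                if ta >= 7 and ta - tb >= 2: return True
                if tb >= 7 and tb - ta >= 2: return False

def win_match(rng, best_of=5):
    # A match: best of 5 sets.
    need = best_of // 2 + 1
    a = b = 0
    while a < need and b < need:
        if win_set(rng): a += 1
        else: b += 1
    return a == need

rng = random.Random(0)
n = 2000
rate = sum(win_match(rng) for _ in range(n)) / n
print(f"point win rate: {P_POINT:.0%}, simulated match win rate: {rate:.0%}")
```

    Under these simplified assumptions the simulated match-win rate lands far above 54% — the edge compounds at each level of scoring, which is exactly the games-not-points lesson.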


  • Biopharma scrambling to handle Biosecure Act

    Strong regional industrial ecosystems like Silicon Valley (tech), Boston (life science), and Taiwan (semiconductors) are fascinating. Their creation is rare and requires local talent, easy access to supply chains and distribution, academic & government support, business success, and a good amount of luck.

    But, once set in place, they can be remarkably difficult to unseat. Take the semiconductor industry as an example. Its geopolitical importance has directed billions of dollars towards re-creating a domestic US industry. But it faces an uphill climb. After all, it’s not only a question of recreating the semiconductor manufacturing factories that have gone overseas, but also:

    • the advanced and low-cost packaging technologies and vendors that are largely based in Asia
    • the engineering and technician talent that is no longer really in the US
    • the ecosystem of contractors and service firms that know exactly how to maintain the facilities and equipment
    • the supply chain for advanced chemicals and specialized parts that make the process technology work
    • the board manufacturers and ODMs/EMSs who do much of the actual work post-chip production that are also concentrated in Asia

    A similar thing has happened in the life sciences CDMO (contract development and manufacturing organization) space. In much the same way that Western companies largely outsourced semiconductor manufacturing to Asia, Western biopharma companies outsourced much of their core drug R&D and manufacturing to Chinese companies like WuXi AppTec and WuXi Biologics. This has resulted in a concentration of talent and an ecosystem of suppliers there that would be difficult to supplant.

    Enter the BIOSECURE Act, a bill being discussed in the House with a strong possibility of becoming law. It prohibits the US government from working with companies that obtain technology from Chinese biotechnology companies of concern (including WuXi AppTec and WuXi Biologics, among others). This is causing the biopharma industry significant anxiety, as firms are forced to find (and potentially fund) an alternative CDMO ecosystem that does not currently exist at WuXi’s level of scale and quality.