Category: What I’m Reading

  • The Opportunity in Lagging Edge Semiconductors

    While much attention is (rightly) focused on the role of TSMC (and its rivals Samsung and Intel) in “leading edge” semiconductor technology, the opportunity at the so-called “lagging edge” — older semiconductor process technologies which continue to be used — is oftentimes completely ignored.

The reality of the foundry model is that fab capacity is expensive to build, and so the bulk of the profit on a given process technology is made years after it debuts. This is a natural consequence of three things:

1. Very few semiconductor designers have the R&D budget or the need to be early adopters of the most advanced technologies. (Early adoption is primarily reserved for the sexiest advanced CPUs, FPGAs, and GPUs, which ignores the huge bulk of the rest of the semiconductor market.)
    2. Because only a small handful of foundries can supply “leading edge” technologies and because new technologies have a “yield ramp” (where the technology goes from low yield to higher as the foundry gets more experience), new process technologies are meaningfully more expensive.
3. Some products have extremely long lives and need to be supported for a decade or more (e.g. automotive, industrial, and military applications immediately come to mind)

    As a result, it was very rational for GlobalFoundries (formerly AMD’s in-house fab) to abandon producing advanced semiconductor technologies in 2018 to focus on building a profitable business at the lagging edge. Foundries like UMC and SMIC have largely made the same choice.

This means giving up on some opportunities (those that require newer technologies) — as GlobalFoundries has found recently in areas like communications and data center — but, provided a foundry has the service capability and capacity, the lagging edge can still lead to not only a profitable outcome, but one which is incredibly important to the increasingly strategic semiconductor space.


  • NVIDIA to make custom AI chips? Tale as old as time

    Every standard products company (like NVIDIA) eventually gets lured by the prospect of gaining large volumes and high margins of a custom products business.

    And every custom products business wishes they could get into standard products to cut their dependency on a small handful of customers and pursue larger volumes.

Given the above, the fact that NVIDIA effectively used to build custom products (e.g. for game consoles and for some of its dedicated autonomous vehicle and media streamer projects), and the efforts by cloud vendors like Amazon and Microsoft to build their own Artificial Intelligence silicon, it shouldn’t be a surprise to anyone that NVIDIA is pursuing this.

    Or that they may eventually leave this market behind as well.


  • Which jobs are the most [insert gender or race]?

    Fascinating data from the BLS on which jobs have the greatest share of a particular gender or race. The following two charts are from the WSJ article I linked. I never would have guessed that speech-language pathologists (women), property appraisers (white), postal service workers (black), or medical scientists (Asian) would have such a preponderance of a particular group.


  • Trouble in commercial real estate

Commercial real estate (and, by extension, community banks) is in a world of hurt as hybrid/remote work, higher interest rates, and deflating/popping property bubbles collide…


    The Brutal Reality of Plunging Office Values Is Here
    Natalie Wong & Patrick Clark | Bloomberg

  • Stocks for the Long Run? Maybe not all the Time

    One of the core assumptions of modern financial planning and finance is that stocks have better returns over the long-run than bonds.

The reason “seems” obvious: stocks are riskier. There is, after all, a greater chance of going to zero, since bond investors come before stock investors in the legal line to get paid out after a company fails. Furthermore, stocks let an investor participate in the upside (if a company grows rapidly) whereas bonds limit your upside to the interest payments.

A fascinating article by Santa Clara University Professor Edward McQuarrie, published in late 2023 in the Financial Analysts Journal, puts that entire foundation into doubt. McQuarrie collected a tremendous amount of data to compute total US stock and bond returns going back to 1792, using newly available historical records and data from periodicals of that era. The result is a much richer dataset, including:

• coverage of bonds and stocks traded outside of New York
• coverage of companies which failed (such as The Second Bank of the United States which, at one point, was ~30% of total US market capitalization and unceremoniously failed after its charter was not renewed)
• inclusion of dividend data (which was omitted in many prior studies)
• calculation of returns on a capitalization-weighted basis (as opposed to price-weighted / equal-weighted, which is easier to do but less accurately conveys the returns investors actually see)
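That last methodological point is easy to see with toy numbers (the prices and share counts below are hypothetical, not from the paper):

```python
# Toy illustration: capitalization-weighted vs. equal-weighted index returns.
# (price_start, price_end, shares_outstanding) for three made-up stocks.
stocks = [
    (10.0, 11.0, 1_000_000),  # big company, +10%
    (50.0, 45.0, 10_000),     # small company, -10%
    (20.0, 24.0, 50_000),     # small company, +20%
]

# Equal-weighted: average each stock's return, ignoring company size.
equal_weighted = sum(end / start - 1 for start, end, _ in stocks) / len(stocks)

# Capitalization-weighted: weight by starting market cap -- the return an
# investor holding "the whole market" would actually have seen.
start_cap = sum(start * sh for start, _, sh in stocks)
end_cap = sum(end * sh for _, end, sh in stocks)
cap_weighted = end_cap / start_cap - 1

print(f"equal-weighted: {equal_weighted:.2%}")  # 6.67%
print(f"cap-weighted:   {cap_weighted:.2%}")    # 10.00%
```

Because the big company dominates the starting market cap, the cap-weighted figure tracks its +10% far more closely than the equal-weighted average does — which is why the two methods can tell quite different stories about historical returns.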

The data is fascinating, as it shows that, contrary to the opinion of most “financial experts” today, it is not true that stocks always beat bonds in the long run. In fact, the much better performance of stocks in the US seems to be mainly a 1940s-1980s phenomenon (see Figure 1 from the paper below)

    Stock and bond performance (normalized to $1 in 1792, and renormalized in 1982) on a logarithmic scale
Source: Figure 1, McQuarrie

Put another way, if you had looked at stocks vs bonds in 1862, the sensible thing to tell someone was “well, some years stocks do better, some years bonds do better, but over the long haul, it seems bonds do better” (see Table 1 from the paper below).

The exact opposite of what you would tell them today, having only looked at the post-War world.

Source: Table 1, McQuarrie

This problem is compounded if you look at non-US stock returns where, even after excluding select stock market performance periods due to war (i.e. Germany and Japan following World War II), focusing even on the last 5 decades shows comparable performance between non-US stocks and non-US government bonds.

Even assumptions viewed as sacred, like the idea that stocks and bonds can balance each other out because their returns are poorly correlated, show huge variation over history — with the two assets being highly correlated pre-Great Depression, but much less so (and swinging wildly) afterwards (see Figure 6 below)

    Stock and Bond Correlation over Time
Source: Figure 6, McQuarrie
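A rolling correlation of this kind takes only a few lines to compute. This sketch uses made-up annual return series (not McQuarrie’s data) purely to show the mechanics:

```python
import random

random.seed(0)
# Hypothetical annual returns standing in for stocks and bonds.
stock = [random.gauss(0.08, 0.15) for _ in range(100)]
bond = [0.3 * s + random.gauss(0.03, 0.05) for s in stock]  # partially linked

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def rolling_corr(xs, ys, window):
    """One correlation value per trailing window of `window` years."""
    return [pearson(xs[i - window:i], ys[i - window:i])
            for i in range(window, len(xs) + 1)]

corrs = rolling_corr(stock, bond, window=20)
print(f"{len(corrs)} windows, range {min(corrs):.2f} to {max(corrs):.2f}")
```

Even this toy series drifts noticeably from window to window; the real two-century record swings far more, which is the paper’s point.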

    Now neither I nor the paper’s author are suggesting you change your fundamental investment strategy as you plan for the long-term (I, for one, intend to continue allocating a significant fraction of my family’s assets to stocks for now).

    But, beyond some wild theorizing on why these changes have occurred throughout history, what this has reminded me is that the future can be wildly unknowable. Things can work one way and then suddenly stop. As McQuarrie pointed out recently in a response to a Morningstar commenter, “The rate of death from disease and epidemics stayed at a relatively high and constant level from 1793 to 1920. Then advances in modern medicine fundamentally and permanently altered the trajectory … or so it seemed until COVID-19 hit in February 2020.”



    Stocks for the Long Run? Sometimes Yes, Sometimes No
    Edward F. McQuarrie | Financial Analysts Journal

  • InVision founder retro

    As reported in The Information a few days ago, former design tool giant InVision, once valued at $2 billion, is shutting down at the end of this year.

While much of the commentary has been about Figma’s rapid rise and InVision’s inability to respond, I saw this post on Twitter/X from one of InVision’s founders, Clark Valberg, about what happened. The screenshotted message he left is well worth a read. It is a great (if slightly self-serving / biased) retrospective.

    As someone who was a mere bystander during the events (as a newly minted Product Manager working with designers), it felt very true to the moment.

    I remember being blown away by how the entire product design community moved to Sketch (from largely Adobe-based solutions) and then, seemingly overnight, from Sketch to Figma.

While it’s fair to criticize the leadership for not seeing web-based design as a place to invest, I think the piece highlights how, because Figma wasn’t a direct competitor to InVision (but to Sketch & Adobe XD) and because web-based design wasn’t on anyone’s radar at the time, it became a lethal blind spot for the company. It’s Tech Strategy 101 and perfectly illustrates Andy Grove’s old saying: “(in technology,) only the paranoid survive”.


    Tweet from @ClarkValberg
    Clark Valberg | Twitter/X

  • Public EV chargers suck

    We have a Nissan Ariya and currently DON’T have a home charger (yet — waiting on solar which is another boondoggle for another post). As we live in a town with abundant EVGo chargers (and the Ariya came with 1 yr of free EVGo charging), we thought we could manage.

When it works, it’s amazing. But it doesn’t work … a frustrating proportion of the time. And, as a result, we’ve become oddly superstitious about which chargers we go to and when.

    I’m glad the charging companies are aware and are trying to address the problem. As someone who’s had to ship and support product, I also recognize that creating charging infrastructure in all kinds of settings which need to handle all kinds of electric vehicles is not trivial.

But, it’s damn frustrating to not be able to count on these (rest assured, we will be installing our own home charger soon), so I do hope that future Federal monies will come with strict uptime requirements and penalties. Absent this, vehicle electrification becomes incredibly difficult outside of the suburban homeowner market.


  • Will Hong Kong Put the Fear Into China’s Property Builders?

The collapse of China’s massive property bubble is under way, and it is wreaking havoc because significant amounts of the debt raised by Chinese property builders came from offshore investors.

    Because of (well-founded) concerns on how Chinese Mainland courts would treat foreign concerns, most of these agreements have historically been conducted under Hong Kong law. As a result, foreign creditors have (understandably) hauled their deadbeat Chinese property builder debtors to court there.

While the judgements (especially from Linda Chan, the subject of this Bloomberg article) are unsurprisingly against the Chinese property builders (who have been slow to release credible debt restructuring plans), the big question remains whether the Mainland Chinese government will actually enforce these rulings. It certainly would make life harder for the (at least until recently very well-connected) Chinese property builders at a moment of weakness in the sector.

    But, failure to do so would also hurt the Chinese government’s goal of encouraging more foreign investment: after all, why would you invest in a country where you can’t trust the legal paper?


• Amazon Delivers More US Packages than UPS and FedEx

It’s both unsurprising and astonishing at the same time.


  • Going from Formula One to Odd One Out

Market phase transitions have a tendency to be incredibly disruptive to market participants. A company or market segment that used to be the “alpha wolf” can suddenly find itself an outsider in a short time. Look at how quickly Research in Motion (makers of the Blackberry) went from industry darling to laggard after Apple’s iPhone transformed the phone market.

    Something similar is happening in the high performance computing (HPC) world (colloquially known as supercomputers). Built to do the highly complex calculations needed to simulate complex physical phenomena, HPC was, for years, the “Formula One” of the computing world. New memory, networking, and processor technologies oftentimes got their start in HPC, as it was the application that was most in need of pushing the edge (and had the cash to spend on exotic new hardware to do it).

The use of GPUs (graphical processing units) outside of games, for example, was an HPC calling card. NVIDIA’s CUDA framework, which has helped give it such a lead in the AI semiconductor race, was originally built to accelerate the types of computations that HPC could benefit from.

The success of Deep Learning as the chosen approach for AI benefited greatly from this initial work in HPC, as the math required to make deep learning work was similar enough that existing GPUs and programming frameworks could be adapted. And, as a result, HPC benefited as well, as more interest and investment flowed into the space.

But we’re now seeing a market transition. Unlike HPC, which performs mathematical operations requiring every last iota of precision on mostly dense matrices, AI inference works on sparse matrices and does not require much precision at all. This has resulted in a shift in the industry away from software and hardware that works for both HPC and AI and towards the much larger AI market specifically.
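The precision point can be made concrete with a toy experiment: a naive matrix multiply whose running sum is rounded to a few significant digits (a crude, hypothetical stand-in for a low-precision accumulator, not how real hardware works). The perturbation stays small relative to the result, which is why AI inference tolerates reduced precision while traditional HPC workloads cannot:

```python
import random

def matmul(a, b, sig_digits=None):
    """Naive matrix multiply; if sig_digits is set, round the running sum
    after each step to mimic a low-precision accumulator."""
    n, m, p = len(a), len(b), len(b[0])
    out = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            acc = 0.0
            for k in range(m):
                acc += a[i][k] * b[k][j]
                if sig_digits is not None:
                    acc = float(f"{acc:.{sig_digits}g}")
            out[i][j] = acc
    return out

random.seed(1)
n = 16
a = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n)]
b = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n)]

exact = matmul(a, b)                 # full double precision
rough = matmul(a, b, sig_digits=3)   # roughly float16-like significand

scale = max(abs(x) for row in exact for x in row)
rel_err = max(abs(e - r) for er, rr in zip(exact, rough)
              for e, r in zip(er, rr)) / scale
print(f"max relative error at 3 significant digits: {rel_err:.1e}")
```

An error at this scale is noise for a neural network’s fuzzy weights, but ruinous for, say, a physics simulation integrated over millions of timesteps, where such errors compound.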

Couple that with the recent semiconductor shortage (making it harder and more expensive to build HPC systems with the latest GPUs) and the fact that research suggests some HPC calculations are more efficiently approximated with AI methods than actually run (in the same way that NVIDIA now uses AI to take a game rendered at a lower resolution and simulate what it would look like at a higher resolution more effectively than rendering the game natively at that resolution), and I think we’re beginning to see traditional HPC shift from the “Formula One of computing” to, increasingly, the “odd one out”.


    Trying to Do More Real HPC in an Increasingly AI World
    Timothy Prickett Morgan | The Next Platform

  • Are Driverless Cars Safer? (This time with data)

I’m over two months late to seeing this study, but a brilliant study design (using insurance data to measure rates of bodily injury and property damage) and a strong, noteworthy conclusion (no matter how you cut it, Waymo’s autonomous vehicle service resulted in fewer injuries per mile and less property damage per mile than human drivers in the same area) make it worthwhile to return to! It’s a short and sweet paper from researchers at Waymo, Swiss Re (the re-insurer), and Stanford that is well worth the 10-minute read!


  • AI for Defense

    My good friend Danny Goodman (and Co-Founder at Swarm Aero) recently wrote a great essay on how AI can help with America’s defense. He outlines 3 opportunities:

    • “Affordable mass”: Balancing/augmenting America’s historical strategy of pursuing only extremely expensive, long-lived “exquisite” assets (e.g. F-35’s, aircraft carriers) with autonomous and lower cost units which can safely increase sensor capability &, if it comes to it, serve as alternative targets to help safeguard human operators
    • Smarter war planning: Leveraging modeling & simulation to devise better tactics and strategies (think AlphaCraft on steroids)
    • Smarter procurement: Using AI to evaluate how programs and budget line items will actually impact America’s defensive capabilities to provide objectivity in budgeting

  • Why Childcare *is* Economic Infrastructure

As a parent myself, few things throw off my work day as much as a wrench in my childcare — like a kid being sick and needing to come home or a school/childcare center being closed for the day. The time required to change plans while balancing work, the desire to check in on your child throughout the work day to make sure they’re doing okay… and this is as someone with a fair amount of work flexibility, a spouse who also has flexibility, and nearby family who can pitch in.

    Childcare, while expensive, is a vital piece of the infrastructure that makes my and my spouse’s careers possible — and hence the (hopefully positive 😇) economic impact we have possible. It’s made me very sympathetic to the notion that we need to take childcare policy much more seriously — something that I think played out for millions of households when COVID disrupted schooling and childcare plans.

    The Washington Post’s Catherine Rampell also lays this out clearly in an Opinion piece, tracking how the closure of one Wisconsin day-care had cascading impacts on the affected parents and then their employers.


  • Good Windows on ARM at last?

    Silicon nerd 🤓 that I am, I have gone through multiple cycles of excited-then-disappointed for Windows-on-ARM, especially considering the success of ChromeOS with ARM, the Apple M1/M2 (Apple’s own ARM silicon which now powers its laptops), and AWS Graviton (Amazon’s own ARM chip for its cloud computing services).

I may just be setting myself up for disappointment here, but these (admittedly vendor-provided) specs for their new Snapdragon X (based on technology Qualcomm acquired from Nuvia, and over which it is currently being sued by ARM) look very impressive. Biased as they may be, the fact that these chips perform in the same range as Intel/AMD/Apple silicon on single-threaded benchmarks (not to mention the multi-threaded applications which work well with the Snapdragon X’s 12 cores) hopefully bodes well for the state of CPU competition in the PC market!

    Single-threaded CPU performance (Config A is a high performance tuned offering, Config B is a “thin & light” configuration)
    Multi-threaded CPU performance (Config A is a high performance tuned offering, Config B is a “thin & light” configuration)

    Qualcomm Snapdragon X Elite Performance Preview: A First Look at What’s to Come
    Ryan Smith | Anandtech

  • Complex operations for gene editing therapies

Gene editing makes possible new therapies and actual cures (not just treatments) that were previously out of reach. But one thing that doesn’t get discussed a great deal is how these new gene editing-based therapies throw the “take two and call me in the morning” model out the window.

    This interesting piece in BioPharmaDive gives a fascinating look at all the steps for a gene editing therapy for sickle cell disease that Vertex Pharmaceuticals is awaiting FDA approval for. The steps include:

• referral by a hematologist (not to mention insurance approval!)
• collection of cells from the patient (probably via bone marrow extraction)
• (partial) myeloablation of the patient
• shipping of the cells to a manufacturing facility
• application of gene editing to the cells at the manufacturing facility
• shipping of the cells back
• infusion of the gene-edited cells into the patient (so they hopefully engraft back in their bone marrow)

Each step is complicated and has its own set of risks. And, while there are many economic aspects of this that are similar to more traditional drug regimens (high price points, deep biological understanding of disease, complicated manufacturing [esp for biologicals], medical / insurance outreach, patient education, etc.), gene editing-based therapies (which can also include CAR-T therapies) now require a level of ongoing operational complexity that the biotech/pharmaceutical industries will need to adapt to if we want to bring these therapies to more people.


  • The Problem with the MCU

    Something is wrong with the state of the Marvel Cinematic Universe (MCU).

In 2019, Disney/Marvel topped off an amazing decade-plus run of films with Avengers: Endgame, becoming (until Avatar was re-released in China) the highest grossing film of all time. This was in spite of an objectively complicated plot which required a deep understanding of all of the Marvel Cinematic Universe’s continuity to follow.

    And yet critics and fans (myself included! 🙋🏻‍♂️) loved it! It seemed like Marvel could do no wrong.

    It doesn’t feel that way anymore. While I’ve personally enjoyed Black Panther: Wakanda Forever and Shang-Chi, this Time article does a good job of critiquing how complicated the MCU has become, so much so that a layperson can’t just watch one casually.

But it misses one additional thing which I think gets to the heart of why the MCU just doesn’t feel right anymore. The MCU is now so commercially large that the scripts feel like they’re written by a committee of businesspeople (oh, make sure you’re setting up this other show/movie! let’s get in an action scene with some kind of viral quip!) rather than by writers/directors trying to tell an entertaining story for its own sake.

And until they get back to that, I’m not sure even Marvel’s plans to cut down on the number of productions will deliver.


    How Marvel Lost Its Way
    Eliana Dockterman | Time

  • The Parents Trying to Pass Down a Language They Hardly Speak

This article resonates with me on so many levels: both as the child who came to the US and saw his language skills deteriorate as he assimilated, and as the parent trying to preserve his kids’ connection to their cultural heritage.


  • Pixel’s Parade of AI

    I am a big Google Pixel fan, being an owner and user of multiple Google Pixel line products. As a result, I tuned in to the recent MadeByGoogle stream. While it was hard not to be impressed with the demonstrations of Google’s AI prowess, I couldn’t help but be a little baffled…

    What was the point of making everything AI-related?

    Given how low Pixel’s market share is in the smartphone market, you’d think the focus ought to be on explaining why “normies” should buy the phone or find the price tag compelling, but instead every feature had to tie back to AI in some way.

Don’t get me wrong, AI is a compelling enabler of new technologies. Some of the call and photo functionalities are amazing, both as technological demonstrations and in terms of pure utility for the user.

But, every product person learns early that customers care less about how something gets done and more about whether the product does what they want it to. And, as someone who very much wants a meaningful rival to Apple and Samsung, I hope Google doesn’t forget that either.


  • Artificial Wombs

    Anyone who’s ever seen a NICU or statistics on the health outcomes for extremely premature babies understands how artificial womb technology could be incredibly impactful.

    At the same time, it poses some tricky ethical challenges:

    • How can you get informed, ethical consent to trial this? It’s nigh impossible to predict who will need a premature delivery.
    • In places like the US with limitations / controversy on reproductive rights, how do you push this forward without impacting assessments of “fetal viability” that may impact the abortion debate?

    Great piece in Nature News below 👇🏻


  • The “Large Vision Model” (LVM) Era is Upon Us

Unless you’ve been living under a rock, you’ll know the tech industry has been rocked by the rapid advance in performance of large language models (LLMs) such as ChatGPT. By adapting self-supervised learning methods, LLMs “learn” to sound like a human being by learning to fill in gaps in language and, in doing so, become remarkably adept not just at language problems but at tasks requiring understanding and creativity.
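The “fill in the gaps” training signal can be illustrated with a toy stand-in (a count-based predictor of my own invention, not a real LLM; actual models use neural networks, but the idea of predicting a hidden word from its context is analogous):

```python
from collections import Counter, defaultdict

# Tiny corpus standing in for web-scale training text.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# "Training": count which word appears between each (left, right) pair.
context_counts = defaultdict(Counter)
for left, word, right in zip(corpus, corpus[1:], corpus[2:]):
    context_counts[(left, right)][word] += 1

def fill_gap(left, right):
    """Predict the masked word from its immediate neighbors."""
    candidates = context_counts.get((left, right))
    return candidates.most_common(1)[0][0] if candidates else None

print(fill_gap("on", "mat"))   # the word between "on" and "mat"
print(fill_gap("the", "sat"))  # ambiguous here: "cat" and "dog" both fit
```

No labels are needed: the text itself supplies both the question (the context) and the answer (the hidden word), which is what makes self-supervised training scale so well.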

Interestingly, the same is happening in imaging, as models largely trained to fill in “gaps” in images are becoming amazingly adept. The group of a friend of mine, Pearse Keane, at University College London, for instance, just published a model trained using self-supervised learning methods on ophthalmological images which is capable of not only diagnosing diabetic retinopathy and glaucoma relatively accurately, but also of predicting cardiovascular events and Parkinson’s reasonably well.

At a talk, Andrew Ng captured it well by pointing out the parallels between the advances in language modeling that happened after the seminal Transformer paper and what is happening in the “large vision model” world, with this great illustration.

    From Andrew Ng (Image credit: EETimes)