As of early 2019, the world's fastest supercomputer, Summit, clocks in at around 0.1 exaflops (and cost around $160 million). By some estimates, it will take a full exaflop to simulate a human brain with sufficient resolution to reproduce a human mind - and that's after we also solve the problems of knowing which parts of the neurons and related cells need which algorithms, and of scanning a brain in sufficient detail (a scan that will likely take around a full exabyte of RAM to hold).
But flops-per-watt is still improving at a rate close to Moore's law, we've got the OpenWorm project, and there are a non-zero number of people willing to have their brains microtomed ("diced") to create an AI that remembers being them (whether or not said AI actually counts as being them by any particular standard). I've drawn up a spreadsheet projecting the likely range of computer improvements, including some known physical limits, and those limits are a good ways off. I used to think of the computers of the next few decades as "just like what we have now, but slimmer and with better battery life"; now I've got a better intuition of what's actually coming. And, with a little luck and the powers of art and humour, maybe I can share that.
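For the curious, the core of that spreadsheet projection is just exponential extrapolation. Here's a minimal sketch in Python - the doubling time and flop counts are my own illustrative assumptions, not figures from any official roadmap:

```python
import math

def years_until(target_flops: float,
                current_flops: float,
                doubling_years: float = 2.0) -> float:
    """Years until an exponential doubling trend reaches target_flops.

    Assumes compute keeps doubling every `doubling_years` years - a
    Moore's-law-style assumption that may well break down.
    """
    doublings = math.log2(target_flops / current_flops)
    return doublings * doubling_years

# Illustrative numbers: a Summit-class ~0.1 exaflops (1e17) machine
# versus the ~1 exaflop (1e18) brain-emulation estimate above.
gap_years = years_until(1e18, 1e17)
print(f"~{gap_years:.1f} years at a 2-year doubling time")
```

One tenfold gap is only about 3.3 doublings, so even pessimistic doubling times close it within a decade or so - which is why the interesting question isn't "can we reach an exaflop?" but "what happens once exaflops are as cheap as teraflops are now?"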
We start with my New Fursona ( https://sfw.furaffinity.net/view/23074211/ ), fresh off the assembly line, larger than life and twice as lovely. Two full cargo containers of RAM, CPUs, cooling gear, and a few other bits and bobs - which is still tiny compared to the 6,000 42U server racks she'd take up with 2019-era hardware.