This is a valid RSS feed.
This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.
line 648, column 0: (7 occurrences)
<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-emb ...
line 1294, column 0: (7 occurrences)
<content:encoded><![CDATA[
line 2018, column 0: (4 occurrences)
<figure class="wp-block-embed is-type-rich is-provider-twitter wp-block-embe ...
<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
xmlns:content="http://purl.org/rss/1.0/modules/content/"
xmlns:wfw="http://wellformedweb.org/CommentAPI/"
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:atom="http://www.w3.org/2005/Atom"
xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
xmlns:georss="http://www.georss.org/georss" xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#" xmlns:media="http://search.yahoo.com/mrss/"
>
<channel>
<title>Scobleizer</title>
<atom:link href="https://scobleizerblog.wordpress.com/feed/" rel="self" type="application/rss+xml" />
<link>https://scobleizerblog.wordpress.com</link>
<description>Spatial Computing strategies.</description>
<lastBuildDate>Thu, 29 Sep 2022 13:11:21 +0000</lastBuildDate>
<language>en</language>
<sy:updatePeriod>hourly</sy:updatePeriod>
<sy:updateFrequency>1</sy:updateFrequency>
<generator>http://wordpress.com/</generator>
<image>
<url>https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/cropped-img_1144-1.jpeg?w=32</url>
<title>Scobleizer</title>
<link>https://scobleizerblog.wordpress.com</link>
<width>32</width>
<height>32</height>
</image>
<cloud domain='scobleizerblog.wordpress.com' port='80' path='/?rsscloud=notify' registerProcedure='' protocol='http-post' />
<atom:link rel="search" type="application/opensearchdescription+xml" href="https://scobleizerblog.wordpress.com/osd.xml" title="Scobleizer" />
<atom:link rel='hub' href='https://scobleizerblog.wordpress.com/?pushpress=hub'/>
<item>
<title>ANNOUNCEMENT and Why the Tesla Humanoid Robot Matters</title>
<link>https://scobleizerblog.wordpress.com/2022/09/29/announcement-and-why-the-tesla-humanoid-robot-matters/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Thu, 29 Sep 2022 13:11:21 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9592</guid>
<description><![CDATA[Irena Cronin and I are seeing huge shifts coming as autonomous vehicles get to the point where they are driving around San Francisco without humans. We recently started comparing notes and we are seeing the same trends.  So, today we are announcing that I am rejoining Infinite Retina as Chief Strategy Officer. We are bringing … <a href="https://scobleizerblog.wordpress.com/2022/09/29/announcement-and-why-the-tesla-humanoid-robot-matters/" class="more-link">Continue reading <span class="screen-reader-text"><strong>ANNOUNCEMENT and Why the Tesla Humanoid Robot Matters</strong></span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<p>Irena Cronin and I are seeing huge shifts coming as autonomous vehicles get to the point where they are driving around San Francisco without humans. We recently started comparing notes and we are seeing the same trends. </p>
<p>So, today we are announcing that I am rejoining Infinite Retina as Chief Strategy Officer. We are bringing new understanding to entrepreneurs and product strategists on Augmented Reality and everything related to it, including AI and Computer Vision. Here is our first analysis (located at: [url]) on why we should all be paying attention to what is happening in humanoid robots and consumer electronics, which include autonomous vehicles that are now arriving in people’s garages and, soon, Augmented Reality devices from Apple and others.</p>
<p>Tomorrow Elon Musk will step on stage and show us the latest AI and robotics. We think this is a much more important announcement than most people are expecting and here’s an analysis of just how deeply Optimus (Tesla’s humanoid robot), and other humanoid robots, will change all of our homes. </p>
<p>The last time Irena and I collaborated, we wrote a book, <em>The Infinite Retina</em>, that Qualcomm’s head of Augmented and Virtual Reality, Hugo Swart, reviewed as a “must read.” This time, in addition to consulting, Irena and I are doing new analyses in the form of a paid product on Augmented Reality topics that we will offer on a monthly basis. One for people who love Augmented Reality devices, automated electric cars, and other products that make life more fun and better. </p>
<p class="has-x-large-font-size"><strong>Tesla Robot: Consumer Strategy 2028</strong></p>
<p class="has-medium-font-size">“Knock knock.”<br><br>“Who is there?”<br><br>“Your pizza delivery robot. If you invite me in, I can set up your table and do other tasks.”<br><br>It will be the first time a product introduces itself to consumers at their front doors and once inside will bring a wholesale change to all the brands inside. Most of them will go away. The robot will – over the years – replace “old brands” with “new brands” that do the same thing, but better. It’ll even change the showerheads to new models to save energy and water. </p>
<p>Skeptics are right to point out this won’t happen soon. But by 2028 we expect such a robot will be in people’s homes and the Robotaxi (think of Uber without a human driver) will demand the inclusion of a humanoid robot that can do things like deliver dinner or groceries.</p>
<p>Tesla tomorrow will give us a taste of how advanced its robotics program is and how likely we are to get a humanoid robot that helps us at home in five years or less, along with seeing how well it can learn to do new jobs in the factory first. It also could explain the business model and why many Tesla owners will want a robot in their home (it could be a key piece of the RoboTaxi network – plugging in cars to charge them and get them back on the road).</p>
<p>There will be other insights, too. </p>
<p>The catalyst to write this analysis is that we are both seeing signs of a changing consumer, due to Spatial Computing technologies like autonomous vehicles, Augmented Reality, and, particularly, robots.</p>
<p>If you buy into the premise that we are about to see changes in the technologies that go into robots – the AI, the electric motors, the sensor arrays, and, even, in how humans are living – then you will accept that interacting with the robot will change a person from deciding on the brand of soap used in the home, for instance, to letting the robot decide. In our research we’ve found that humans will accept these kinds of changes faster than most consumer products companies believe they will.</p>
<p>These changes go far beyond showerheads or the soap brand you use to wash your clothes, though.</p>
<p>It brings with it a bunch of new technologies that could disrupt even Apple, Google, or Amazon, but soon will start bringing service after service to your home. </p>
<p>The robot brings other robots. (The autonomous vehicle, er, a robot, will bring the humanoid robot to your home, which will bring other, more specialized robots in. This turns everything into a service).</p>
<p>That statement alone brings radical shifts to the economy. </p>
<p>Why hasn’t this happened yet?</p>
<ol class="wp-block-list">
<li>Until now robots were too expensive to be used for general consumer uses.</li>
<li>No distribution or business model existed to entice homeowners to afford a fairly expensive new machine. How many homes can afford to pay $50,000 for one?</li>
<li>The AI or software that controls robots was also very expensive and specialized. A robot at Ford’s plant in Detroit puts windshields into trucks every minute. But it can’t fold laundry. The humanoid robot could do both tasks, which points to similar changes coming to workplaces and factories. Our writing here focuses more on the consumer changes, but our previous book covered both and we bet the same will be true of this newsletter in the future.</li>
</ol>
<p>All three issues holding back humanoid robots are going away at a pretty fast rate. </p>
<p>All of the technologies that go into a humanoid robot are coming down in price at a pretty constant rate and are becoming more capable at about the same rate, so you get an exponential improvement in the number of things a robot can do over time. There are already many robots that do things from vacuuming floors to cleaning windows to picking weeds out of your garden. Plus the efficiency of the computers and motors that would drive its hands and legs is getting better over time, so we can now see it doing real work for hours on one charge.</p>
<p>Back to the autonomous vehicle, and its role in turning everything into a service, which is that it will bring other robots to the home.</p>
<p>Once cars start driving without a human in the car, something that GM’s Cruise, Waymo (spun out of Google), and others are already doing in San Francisco, California and Phoenix, Arizona, then the car can bring other robots AND let the robots be shared among a number of different houses, which defrays their cost.</p>
<p>This piece – how to get robots into homes and paid for – is what previous robotics companies that are now out of business, like Willow Garage or Giant AI, were missing. A $50,000 robot isn’t an expense many can afford, even in richer neighborhoods. The autonomous vehicle unlocks a new business model of turning everything into a service and sharing the robot’s cost amongst many homes. </p>
<p>Autonomous vehicles will, alone, bring an upheaval as consumers move away from owning cars and toward “transportation as a service.” What does that mean? The best example today is Uber. You pull out your phone and you order a car. You pay for what you use. If you only take one trip a month to the local shopping mall, you’ll pay $20. Far less than the $400 a month a new Toyota costs. </p>
<p>The humanoid robot could do the same. Could do your laundry for $100 a week, then move next door, where it could do the same for your neighbor, collecting another $100, and so on and so forth. And if it can do laundry, it can do a lot more in the home or even your business.</p>
<p>When you add autonomous vehicles, humanoid robots, and other major technology shifts like Augmented Reality and virtual beings, that will arrive in 2023, you see not just an economic upheaval but an almost complete change to what it means to be human. </p>
<p>The last time we (Irena Cronin and Robert Scoble) studied the market together the result, <em>The Infinite Retina</em>, earned a “must read” review from Qualcomm’s head of augmented and virtual reality products, Hugo Swart (Qualcomm makes the chips inside everyone’s headsets other than Apple). We reconnected recently after realizing that we were both seeing the same trends from different points of view that very few others were seeing, or studying.<br><br>Why now?<br><br>There are multiple autonomous vehicle companies now driving around without humans. Yes, not many cities yet, but that will change. </p>
<p>That, alone, sets up deep changes to economies around the world as more passenger miles, shipping, and other services change from human driven to AI driven. When it also brings humanoid robots into the home, while Apple brings Augmented Reality to the home at the same time, we see something far more profound happening than we saw when we wrote <em>The Infinite Retina</em> two years ago. </p>
<p>Welcome to the “everything as a service” world and stay tuned to insights from both of us. </p>
<p>Why now? Because Tesla is updating the status of their Optimus humanoid robot and possibly demonstrating an early version of it on September 30, 2022.</p>
<p>And, yes, the Optimus will push the doorbell button instead of knocking, if you have one.</p>
<p><strong><em>Life with Tesla Optimus</em></strong></p>
<p>The first Tesla Cybertrucks will already be a few years old when Tesla’s humanoid robot reaches the first waves of consumers, but what will it do when it arrives in 2028?</p>
<p>Well, first of all, we need to talk about the Cybertruck. By 2028 it will be driving around most cities without a human in it, along with other vehicles Tesla makes. When that happens the necessary pre-conditions for humanoid robots will be here. One will walk off the production line and jump into a waiting Cybertruck, which will bring the humanoid robot to people’s homes. Others will go into crates to be shipped around the world to both manufacturing and home users. Once released from their crates they will be able to hop into a Tesla and other vehicles.</p>
<p>How many more years after that will you see Tesla robots everywhere in Western society? 2030? Certainly by 2035. </p>
<p>They will help you load things into your truck at your local Home Depot, loading even heavy sheets of sheetrock. In fact, Home Depot could order many humanoid robots for each store. Such a store would quickly become nicer than Lowe’s, if Lowe’s doesn’t also have the same robots.</p>
<p>Which leads to a lesson: every business soon will have to change deeply to attract partnerships with Tesla and others who will want to compete with Tesla as we move into the “Everything as a Service” world. More on how we see businesses changing later.</p>
<p>Let’s go back to that original pizza delivery. What needs to happen to make that happen?</p>
<ol class="wp-block-list">
<li>The robot has to be made.</li>
<li>The AI has to be capable enough to go into, say, a Round Table pizza restaurant, and be able to talk with the humans there who are behind the counter — “Hi, I’m here to pick up two large pepperoni pizzas for Irena Cronin.”</li>
<li>The robot has to be able to get to Round Table, get out of the vehicle, walk over any obstacle like mud, grass, dog poop, curbs, sand, stairs, etc., and get to both the counter at Round Table as well as the front door of your home while carrying the pizzas in a thermal pouch to keep them piping hot.</li>
</ol>
<p>If it just did that, it would unlock the business model. But, the robot also has to be programmed to interact with people. So, it has to understand people deeply, and, even, have a personality to get to its fullest potential.</p>
<p>Why? Trust.</p>
<p>Would you <em>trust</em> a robot that just did what you told it with no personality? Not as much as if it talked to you in a human way, and, even, entertained you. Adrian Kaehler discovered this while running the Giant AI company (now out of business, but it was working with factory owners to build a humanoid robot run by AI, just like Tesla is). He discovered that when they made their robot look, and act, more like a human, people accepted it more readily than their earlier prototypes that just looked like a machine with hands.</p>
<p>Trust will soon be the most important score companies track about themselves and their products/services. </p>
<p>Consumers’ attitudes toward computers doing intimate things with them, like cooking in the kitchen together, will soon deeply change due to the impact the autonomous vehicle will have on them.<br></p>
<p>Turns out once you trust a computer to drive you around the world, you change as a consumer – you become far <em>more</em> likely to let an AI run your life after that, since you realize that the AI doesn’t kill you while driving you around. After that, trusting a company to have a robot in your home doesn’t seem nearly as far-fetched a proposition as before you put your life in a robot’s hands (autonomous vehicles are technically robots, too).</p>
<p>So, what will you trust your humanoid robot to do? What will its day be like?</p>
<p>Well, laundry, dishes, shopping, cleaning, gardening, security, maintenance, and, even, saving your life. Future robots will be able to perform CPR on you, saving your life if you have a heart attack in your home. It can call 911 while it is doing that too. The operator might not even realize he or she is talking to a computer. “Hi, I’m a Tesla Optimus calling on behalf of Mary Smith and I’m currently performing CPR on her, and she is showing symptoms of having a heart attack. She has a pulse, but it is a weak one.”</p>
<p>Not every day will be so dramatic as saving a life for the robot. </p>
<p>In fact, the first robots we see will usually be pretty simplistic, at first. What is the low hanging fruit it will pick first? Deliveries! Yes, your first humanoid robot will probably arrive at your door with some pizzas or groceries. </p>
<p><strong>Take a Ride in the Future</strong><br><br>A truck is rolling up with your pizza delivery. It is the first thing the humanoid robot will do as a service. Why? If you can’t deliver pizza you can’t deliver anything else. So it will be how many people have their first encounter with a humanoid robot. </p>
<p>You are watching from your home’s front window as a human-looking robot hops out of the passenger seat, walks to the back of the truck, and grabs a heated and insulated bag of pizza from the cargo area and starts walking up to your house. </p>
<p>You had heard they were coming, from TikTok videos from other neighborhoods. You knew they could talk with you, show you things on its screen, and that they can ring your doorbell, but so far, even though they moved gracefully and quickly, they couldn’t yet enter the home. This was just an introduction to the robot. Even so, the experience is so different and unique that people record their first meetings on their glasses and phones and share them on social media or through direct messages to family members and friends. “It’s here.”<br><br>“Hello Irena Cronin, we have two pizzas for you. Here is what they looked like when they were put into the box.” (The robot’s face turns to a screen where it shows both pizza photos being put into the box).</p>
<p>“Hope you like your order. Today I can’t come into your home, but starting next week, we can do some common tasks in the home for a low monthly price, but we’ll do the first six months for free. Washing dishes and doing your laundry. Plus we can monitor your home for you, making it safer and more efficient. All if you want, of course. If you do, download the “Tesla Robot” app to your Augmented Reality glasses (a barcode appears on the robot’s face). It has been a pleasure serving you and your family. You can call me anytime with the Tesla app. Thank you.”<br><br>“It’s weird talking to a robot who just handed me pizza.”<br><br>“We get that a lot. Hey, it’s weird for us too! We had to figure out how to get here without ever being here before.”</p>
<p>The AI inside the robot has everything published on the Internet, and quite a few other data sources to pull from in milliseconds. A conversational AI is already planning out potential things the robot can say to you in response to what you say to it. It knows how likely you are to laugh at its jokes before it tells you one. If you laugh earlier or harder than usual, that will be noted in a database about your humor preferences.</p>
<p>But let’s not get into the fun and games yet. The first robot is there to serve a business and make a profit, not just tell you jokes. The first business is the pizza delivery service.</p>
<p>It will be followed by thousands of services, all controlled by you as long as you are talking to either the Tesla app on your glasses, or the same app on one of your older tablets, phones, or computers. As long as you are within earshot of the Tesla Optimus and as soon as it verifies your identity, which usually is done before you even start talking, you also have complete control of it. Particularly if you own a Tesla vehicle, since you already are running the Tesla app full time to control your vehicle. If you are an owner, you have a virtual robot in your Augmented Reality headset that looks, talks, and walks exactly like the real robot. It can walk next to you, and you will think you have a real robot walking alongside of you. At least until you say something like “Hey, Tesla, can you change my robot into an elephant?” If you have Augmented Reality glasses on, why, yes it can!</p>
<p>To make a business, there are a lot of boring steps that have to happen before the robot walks up to your door and knocks or rings your doorbell. Things like walking into a Round Table pizza shop, waiting in line like a human, introducing itself to the person behind the counter, and asking for Irena’s pizza. </p>
<p>Also, when it is walking from the curb or parking space to your front door, it has to navigate many different things. Some people have stairs. Some people’s front doors are across weird bridges, some made out of rock and wood, others even rope. We have visited homes all around the world, in China, India, Israel, South Africa, and many European countries, along with homes in Canada and Mexico and have seen this. </p>
<p>Yet others might require walking across some dirt to get to the front door, or navigating past a security guard keeping people who aren’t residents from entering an elevator to the Penthouse Suites. And we haven’t even talked about snow or ice that such a robot would need to navigate without dropping the pizza. </p>
<p>That, alone, will require huge computer science efforts that cost many billions. Many of those billions have already been spent by teams building autonomous vehicles at places like Google and its Waymo spinout, Apple, Tesla, Mercedes Benz, Amazon’s Zoox, Nuro, GM’s Cruise, Aurora, Wayve, or a few others. But moving a robot through a chaotic environment to your front door will require billions more. Some people live in places with huge crowds right outside their front doors, others live in the middle of forests of trees. A robot will need to navigate all that and interact with people along the way. Every interaction is a potential customer so it has to be nice, funny, trustworthy, all in an attempt to win customers. </p>
<p>Just talking their way past security guards and doormen is quite a challenge for human delivery people. Getting up to apartment 4B isn’t as easy as it looks sometimes. Humans often have to call up a resident to validate they really wanted a pizza delivery. The robot can do that automatically, and you can be shown on a video on its face. If it uses one of the new 3D screens that have been shown around, the robot can actually show you what something looks like in 3D on the screen that is its face, including your face upstairs as you wait for your pizza: 3D telepresence both inside and around a robot. </p>
<p>The big business idea is that the robots (self-driving cars) will bring other robots (humanoid robots), which then will bring other robots (for specialized tasks like vacuuming, cleaning windows, snowblowing, gardening, and probably a lot more). </p>
<p>But for the first humanoid robot that gets into the home, there are also other things it can do in addition to delivering pizza:</p>
<ol class="wp-block-list">
<li>Make the home and the people living there more efficient energy users.</li>
<li>Give time back to family to do something better with.</li>
<li>Build a buying club so bulk pricing lowers cost and improves quality of everyday things.</li>
<li>Introduce new kinds of healthcare and other lifestyle services into the home, improving the health of everyone in the home (it can make better quality food, too). It can monitor your health just by walking by you. Imagine you run by one on your exercise routine and it cheers you on just like my family and a variety of strangers did while I ran marathons in high school.</li>
<li>Improve the safety and security of the home (it can be a sentry at home all night long, noting various problems before you wake up).</li>
<li>Make sure you stick with its service and that you don’t kick it out of your home. </li>
<li>Optimize the home, even tracking what clothes you wear by whether they disappeared from the home during the day.</li>
<li>Introduce new experiences to the home. The robot could say “Hey, the robots are gonna watch the Beyonce concert tonight (we’ll even be part of the concert). You wanna come?”</li>
<li>Introduce new bartering systems with your neighbors. Trading of food or, even, tools. “Hey, I’ll pay $5 to borrow a screwdriver.” The robot can arrange all sorts of things to be moved around the neighborhood.</li>
</ol>
<p>Once the robot gets access to the home it can start optimizing it. Looking for things that could be improved. It also is paying attention to the humans in the home, and is building an internal database of things it learns about you as it watches you. “The human Andrea Kaplan likes eating Cheerios at home at about 7 a.m. every morning.”</p>
<p>In the future this knowledge will make it possible to personalize everything, particularly in relation to the robot. If you have a relationship with the robot, even a cold business only one, it could notice you like Cheerios so it has a bowl, spoon, and your Cheerios and milk on your dining room table waiting for you at 7 a.m.</p>
<p>Of course that means it needs to open drawers and find your spoons, and open the refrigerator and find your milk. Even if it is just doing this simple task, isn’t it also making a database of every product next to these things too? Of course it is and that, alone, will teach the AI a lot about your personality and likes and, even, your belief system. Bringing massive changes to what humans believe about privacy. </p>
<p>Why? Well, imagine having a robot that comes into your house but doesn’t talk to you in a fun way. It just does the laundry silently, saying few words. Are you likely to take it with you to your friend’s house? Or to a Home Depot to help with picking up some things for your home improvement project? No. </p>
<p>So, we predict it will talk frequently with you about various topics, and, even, high five you at appropriate times. Why? If you feel it really knows you and entertains you, then you will learn to trust it with, say, doing the groceries. </p>
<p>It is this trust that is worth trillions of dollars as the robot takes on more and more things around you, turning them all into services. </p>
<p><strong><em>First Ten Years: Owning the Delivery and Home Services Markets</em></strong></p>
<p><strong>Expectations for Tesla Optimus</strong></p>
<p>Here are the parameters, among others, that the Tesla Optimus will need to meet our expectations before it can operate in people’s homes, and we will be watching the Tesla AI event for how well it can do each of these:<br><br>1. It should lift at least 55 lbs. Why that much? That is what you can pack and check on an airline. It might need to assist someone loading such a bag into a trunk.</p>
<p>2. It needs to be very quiet. Even when moving around you should never hear it, or only when it has to open a drawer or a door. On the other hand, that might be unnerving for people, so “Hey Tesla can you play some music while walking around?”</p>
<p>3. It needs to be able to communicate with humans via voice and hand signals, along with a screen on its face, switching modes to whatever the human prefers. For instance, the robot could switch to sign language for a deaf customer.</p>
<p>4. It needs to walk fast enough to keep up with a human entering, say, a Round Table Pizza. Oh, heck, Boston Dynamics has robots that do parkour (jumping off of buildings), so maybe we need a little more than just a slow walk, no?</p>
<p>5. It needs to be able to get into, and out of, a Tesla vehicle, including putting on and taking off a seat belt. For extra credit, it could “assist” the car in driving tasks, for instance, by using its higher resolution cameras to see further and have better data to more accurately predict the speed of oncoming traffic.</p>
<p>6. It must figure out how to either knock on the door (without leaving a mark) or ring the doorbell.</p>
<p>7. It must be able to carry a package of goods, such as pizzas, from the cargo area to the front door while always keeping them horizontal. Same with a cake. Same with eggs. Can’t break anything or drop anything.</p>
<p>8. It must show the beginnings of a personality with the ability to entertain and delight. In other words, it must have conversational skills that so far computers haven’t demonstrated.<br>9. It must prove that it will be able to bring more services into the home than is possible otherwise (business model of robot bringing other robots).</p>
<p>10. It must demonstrate that it will never hurt humans, children or animals.</p>
<p>We’ll also be watching for skills that will be needed in both factory work and home service work. For instance, can it install a towel rack at home? The skills it would need are similar to those needed to put an electric motor into a vehicle on an assembly line.</p>
<p>Why wouldn’t Tesla own the delivery and home services markets if it delivered a humanoid robot that does all that?</p>
<p><strong>Data, Data, Everywhere</strong></p>
<p>Our thesis is that the biggest dataset wins a lot.</p>
<p>It isn’t just our thesis, either. Many strategists at many companies are trying to find new sources of data. NVIDIA laid out the ultimate end of this strategy: one data system that drives everything – robots, autonomous vehicles, Augmented Reality, and virtual beings. </p>
<p>We call this new strategy “the data hydra.” NVIDIA’s Omniverse is the best laid out example, but others are being built at Tesla, Apple, Google, Niantic, Meta, Bytedance, among others.</p>
<p>On September 20th, 2022, NVIDIA announced new features of its Omniverse. At the heart is a simulator that lets AI teams train the system to do new things. Or study a mistake it made by walking around the intersection where an accident occurred. </p>
<p>This hydra needs the data to build a digital twin of everything. What is a digital twin? Think of it as a very faithful digital copy of the real world. Our factories, malls, parks, and other things will soon all have at least one copy. In some places, like Times Square, we can see that there will be hundreds of millions of copies. You could leave pictures or videos of your family on top of this digital twin. And that is just the start. Lumus, an optics company building the displays that will be in future Augmented Reality glasses, showed us that by 2025 this digital twin will let us watch a concert in a new way. All around our couch will be a music festival and, thanks to Spatial Audio, it’ll sound concert level too. In some cases what people will hear in their AirPods Pro will be better than what they will hear at a professional concert. Even a high-end one, like Coachella. Augmented Reality headphones there “augmented” the audio, making it better. You could turn up the bass, for instance, or remove crowd noise, or turn down the concert to a more acceptable level. Business travelers already know that the best noise canceling headphones block out a screaming baby in the seat next to you. </p>
<p>Adrian Kaehler, a computer vision pioneer who built the computer vision system for the first autonomous vehicle at Stanford and was an early key exec at Magic Leap, started a humanoid robotics company, Giant AI. That company failed to get enough funding. Why? If you start analyzing any job that a robot might do, you can see that a humanoid robot that can walk around, learning on its own, will decimate others.</p>
<p>Where Giant took showing the robot six or so times to “teach” the AI how to do a task, like putting material into a machine, the Tesla robot, thanks to this data advantage and all that it brings, will learn after one time, or will “reason” through it. After all, Tesla’s AI can drive down a street it has never seen. Some joke that it will learn things from watching YouTube, but our kids are already learning that way so the AI can too. We no longer laugh. The AI ingestion engines at foundation models like Dall-e or Stable Diffusion ingest hundreds of millions of images. Soon we will see these kinds of AIs evolve into new kinds of information interactions (we used to call this searching).</p>
<p>The robot might read you a poem generated by another AI, say, GPT-4, all while putting away the groceries. Why? It knew you like poetry and wanted to make you smile.<br><br>Let’s go directly to the point: after looking at how fast all the systems that go into building a robot are improving, we now believe the humanoid robot of 2030 will be so good that humans who have one in their home will often feel that it is a friend and associate. If they are that good, you will bring one to lots of places to “help” you and your family. </p>
<p><strong>Tesla AI and Its Simulator Advantage</strong></p>
<p>Every Tesla car that gets upgraded with the company’s latest AI stack (Tesla calls it FSD Beta) ends up uploading about 30 gigabytes of data to its neural network in the Tesla cloud (Tesla calls the new version of that “Dojo”). </p>
<p>That feeds a simulator that lets researchers “walk around” what looks like the real world. With moving pedestrians, bikers, hundreds of cars, and more. </p>
<p>It is this simulator that is one of Tesla’s many secret advantages. The simulator shows off the advantage of having huge amounts of data generated by an army of Tesla robots (cars) moving around the world. </p>
<p>It lets AI researchers train new AI models to do new tasks. In 2021 Tesla introduced an autotagger into the system, which brought about a huge shift in how these systems can learn. The AI already knows hundreds of thousands of objects in the real world and automatically tags anything that it knows well. This speeds up the AI’s ability to start learning automatically. </p>
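<p>For readers who want to see the shape of that idea in code, here is a minimal, generic sketch of auto-tagging (often called pseudo-labeling): a model that already recognizes common objects labels new camera frames on its own, and only its high-confidence guesses are kept as training data for the next round. The <code>Detection</code> class, the <code>auto_tag</code> function, and the 0.9 confidence threshold are illustrative choices for this sketch, not details of Tesla’s actual autotagger.</p>
<pre><code># A generic auto-tagging (pseudo-labeling) sketch in Python.
# Illustrative only; not Tesla's pipeline.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "pedestrian" or "traffic cone"
    confidence: float  # model score between 0 and 1

def auto_tag(frames, detector, keep_above=0.9):
    """Keep only frames the detector is confident about, with its labels."""
    auto_labeled = []
    for frame in frames:
        detections = [d for d in detector(frame) if d.confidence >= keep_above]
        if detections:                 # unsure frames would go to human review instead
            auto_labeled.append((frame, detections))
    return auto_labeled                # feeds the next training round

# Tiny demo with a stand-in detector (a real one would be a neural network).
fake_detector = lambda frame: [Detection("pedestrian", 0.97)]
print(auto_tag(["frame_001.jpg"], fake_detector))
</code></pre>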
<p>Which is where we are headed. There are plenty of examples of AI simulations and robots that start out knowing nothing and, by trying thousands of little experiments over time, figure out how to walk and move on their own. </p>
<p>Tesla has the advantage of being able to study humans in a new way while driving around the real world. Already its researchers needed to train its AI models to understand human movement. What the data from a human running, or walking, or biking looks like. It needed to do that to properly behave around humans in the streets. </p>
<p>This research will give Tesla a lead when it comes to building humanoid robots. It will use the simulator to train robots to do a wide variety of tasks, long before Tesla makes a physical robot that can walk into your kitchen. </p>
<p><strong>How Does It Progress Over Ten Years?</strong></p>
<p>The next ten years will see radical change due to Spatial Computing:</p>
<ol class="wp-block-list">
<li>Augmented Reality glasses are worn by the majority of people.</li>
<li>Autonomous vehicles are everywhere on our streets.</li>
<li>Virtual beings hang out with us all day long.</li>
<li>Robots of all types are working all around us.</li>
<li>Many homes now have solar AND backup batteries AND an electric vehicle charging station.</li>
<li>AI systems now ingest massive amounts of data every day and can hallucinate back to you complex scenes, along with running everything in life. The AI foundation models that bring us things like Dall-e are going to look very quaint in a decade.</li>
</ol>
<p>Here are our predictions for the humanoid robot specifically in 2033:</p>
<ol class="wp-block-list">
<li>It will have much higher resolution imaging sensors (cameras, LIDARs, etc.) than today. By 2033, cameras on autonomous vehicles and robots will go from the 1K they are today to 32K. That means they can see further, and smaller, things. So where it might have struggled to pick up a very small screw before, now it can see it without any problem. It also means a robot in an older autonomous vehicle will be able to “assist” the original vehicle and see further. </li>
<li>Most tasks in the home will be turned into services by then. It will be able to install many consumer electronics, or even a shower rack in the bathroom. </li>
<li>It takes over the management of the home (laundry, dishes, garbage, security, and monitoring and controlling all lights, appliances, vehicles, charging, and more). </li>
<li>At homes that have an electric car charging station, the robot will meet an incoming vehicle and plug it in for charging. This will make the Robotaxi system more resilient and let it get vehicles back on the road after a charge.</li>
<li>Robots will run many businesses that only cater to the automated vehicle network (making food that gets delivered to people’s homes, for instance).</li>
<li>An “air traffic control system” that runs the transportation as a service that Elon Musk calls “Robotaxi” will make sure robots and autonomous vehicles are sent to the right place at the right time. This is difficult because when there are large concerts, for instance, like Coachella, this control system will need to move thousands of cars from around the Western United States to Palm Springs to move people around there (we visited Uber’s effort at that festival to understand the traffic control and movement issues involved).</li>
<li>Humanoid robots used to be “janky” because they couldn’t do a wide variety of things well. Those days are gone – AI rapidly learned to get better.</li>
<li>Humanoid robots are far more advanced at interacting with humans than they are today. It won’t be unusual to have long, detailed conversations with your robot.</li>
<li>The network will be a lot smarter than today. Your robot will know everything that is happening across the world, in real time. It can read every single tweet. You certainly can’t. </li>
<li>The robot will enable new services in your home, like telepresence that goes way beyond what Zoom makes possible today.</li>
<li>Automatic shopping services are common. Consumers learned to trust their autonomous vehicles with their lives and, so, hand over shopping to the robot who, from that point on, always makes sure your refrigerator has milk for the kids.</li>
</ol>
<p>What really gets fun is when you mix robots, autonomous vehicles, together with Augmented Reality glasses. That brings effects that will be hard to predict. After all, when movie cameras and cinemas were invented how many more decades did it take for Star Wars to show up?</p>
<p>But we can see people getting a few robots to come over for Friday evening with their friends, and the robots will serve dinner and then will perform a short skit for entertainment after dinner. You’ll wear your Augmented Reality glasses and that will “dress up” your robot in different characters. Immersion is really improved when your robot hands you things while inside an immersive experience. This is how 2033 could be very weird compared to today.</p>
<p><strong><em>The Big Picture</em></strong></p>
<p>What are the possible larger impacts of the Tesla Optimus? With the Optimus comes an increase in home services spending, as well as an opportunity for Tesla to control the complete supply chain of products that Optimus uses in the home.</p>
<p>The increase in home services spending comes from consumers buying the services that Optimus can do – those services that a person does not have time to do, or just does not want to do. Optimus can serve the same kind of function that a housekeeper or maid has, but can handle more work at the same time and for a much longer period of time.</p>
<p>Additionally, Optimus can do things in the home that a housekeeper cannot, such as run diagnostics on major appliances and gauge how they are performing and if they are running efficiently. It could also do this for the Tesla car in a person’s garage, as well as ready it for use in the morning which is really useful to tired, hard-working families and professionals.</p>
<p>In addition to these functions, it could serve as a very energetic handyman, plumber, housepainter, etc. Doing all these services and replacing traditional professionals significantly changes the dynamics of the home services market. This disruption has the potential to substantially enlarge that market due to the efficiencies and superior attention to detail and physical strength of the Optimus. </p>
<p>In terms of how the Optimus would be made available to consumers, there would probably be several different channels for this. One possibility would be for a company to buy several Optimuses and rent or lease them out. Another would be direct purchases by upper class families, and a third way could be the buying of community Optimuses by home owners associations (HOAs), neighborhoods or cities.</p>
<p>In the process of its work, the Optimus will be using cleaning products and house improvement and handyman goods. For ease and scale, Tesla has the opportunity to make direct deals with the companies that provide these since it would need these in bulk. In this way, Tesla could control the complete supply chain of these products and goods for the Optimus; companies that make these products and goods would line up to be included since the sales volume would be so high.</p>
<p>While it is difficult to currently assess how big the potential market will be for the Optimus, it would encompass a large majority of upper middle and upper class people with families, as well as single, childless professionals, first in the U.S. and then in other parts of the world.</p>
<p>The economic impact that the Optimus brings, taking into account even a mid-market penetration, will be significant. Why? Because the potential market is so big. Because the Optimus can do such a wide range of tasks, it will be relatively more efficient and will consolidate and increase the need for its services. The Optimus can go to a home and perform many varied services during that visit that would usually take four or five different kinds of workers. People who would not have been enticed before for certain home services would now take advantage of having those services done. Why not? If the Optimus can cook, mow the lawn, paint, babysit, diagnose electrical issues and so much more, it is very convenient for it to do many-varied tasks during any visit.</p>
<p>What is the impact on the workers the Optimus replaces? Yes, this has the potential of putting many different categories of service people out of business. Robotics and automation tend to have that effect in all kinds of areas of life. We don’t have an answer as to what will happen to the displaced workers; we only know that it will happen.</p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
<item>
<title>The First Tour of Giant AI’s Robot Lab</title>
<link>https://scobleizerblog.wordpress.com/2022/06/28/the-first-tour-of-giant-ais-robot-lab/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Tue, 28 Jun 2022 09:38:38 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9580</guid>
<description><![CDATA[Visiting Giant AI is like getting a tour of a secret lab that shouldn’t exist run by an eccentric genius. The kind of which we remember from “Back to the Future.” Adrian Kaehler is that genius. He built the computer vision system for the first autonomous vehicle that later became Waymo. He played a key … <a href="https://scobleizerblog.wordpress.com/2022/06/28/the-first-tour-of-giant-ais-robot-lab/" class="more-link">Continue reading <span class="screen-reader-text">The First Tour of Giant AI’s Robot Lab</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<p></p>
<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="embed-youtube"><iframe title="Tour of Giant AI's Robot Lab" width="1100" height="619" src="https://www.youtube.com/embed/ksyxVLsRlQ0?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div><figcaption>Visiting Giant AI</figcaption></figure>
<p>Visiting Giant AI is like getting a tour of a secret lab that shouldn’t exist run by an eccentric genius. The kind of which we remember from “Back to the Future.”<br><br>Adrian Kaehler is that genius. </p>
<p>He built the computer vision system for the first autonomous vehicle that later became Waymo. He played a key role in the early development of Magic Leap, an augmented reality company that just won best of show at the industry’s biggest gathering, AWE (for Augmented World Expo). He also wrote what many say is the book on Computer Vision which is still used by many computer science departments. Today he is leading the Giant AI company which is building humanoid robots that can work on manufacturing lines, doing the same jobs humans used to do and many new ones. Giant is invested in by Bill Gates and Khosla Ventures. </p>
<p>He saw long ago the problems that robots bring. The earlier companies’ robots were designed and built to be very precise, which means they remain expensive today. You see many of these in factories today: they are heavy, don’t work well with humans, have to be programmed months in advance, are hard to retrain, and don’t recover well when errors are made. Some are too dangerous to be around, like the ones in Tesla’s factory in Fremont, which has some robots in cages to keep all humans away. </p>
<p>He also saw the solution: AI that builds a new kind of operating system. One that learns faster than anything most humans could imagine. It learns so fast that you only need to show it a few times how to do something and it’ll do that thing from then on. One that enables new, lower-cost, less precise components to be used.</p>
<p>When I watch the Universal Worker move, I can see how the tendons that make it work create a very different, animal, sort of motion. It is kind of springy. This would be a non-starter for a traditional robot, but the AI control, just like with a person, manages this and makes it all work out. Dr. Kaehler tells me that the use of this sort of tendon system is central to how the robot can be so light and dexterous, as well as why it can be so much less expensive than traditional robots.</p>
<p> It’s the new AI that enables this new lower cost and safer approach. </p>
<p>So, getting into his lab first meant a lot to me. Why? I think it means a lot to you, too. </p>
<p>It means we will have to rethink work. From scratch.<br><br>Is your happiness and income coming from you pushing a button on a machine? Really? I worked on HP’s manufacturing line when I was a young man of 17. One of my first jobs was working the wave soldering machine there and shoving circuit boards into the wave, which instantly soldered the whole board. I had helped my parents and brothers hand build hundreds of Apple IIs. My mom taught us to solder. If you get good at it, like my mom was, you could do maybe a board in 30 minutes. From my kitchen, I saw how manufacturing lines can change labor. My mom worked for Hildy Licht, who got hired by Apple to take on the task since Apple couldn’t make enough in its own factory. Apple cofounder Steve Wozniak, AKA “Woz,” told me that those boards had fewer failures than the ones made in Apple’s own factory. It also makes me Apple’s first child laborer (I was 13 at the time).<br><br>Anyway, I never wanted to do such a job again, given how boring it was. I loved that Wave Machine because it saved many hours of labor. I had to stuff boards over and over and over again, and I dreamed of a day when a robot would stuff the board too.<br><br>I wish I had a Universal Worker by Giant AI Corporation back then.</p>
<p>As he showed me around he was telling me what was making these robots so special. The AI inside is next level. See, I’ve been following AI since the very beginning. </p>
<p></p>
<p>I was the first to see Siri.<br><br>That was the first consumer AI app. I also have the first video, on YouTube, of the first Google Self Driving Car. Long before anyone else. That was the first AI on the road. I have been following AI innovators since the very beginning.<br><br>This robot is using the next generation of all that.<br><br>Don’t worry, though.</p>
<p>You do get that we are in an exponential world, right? One where we need more workers, not fewer. Even if Giant got a huge funding deal, for, say, a billion valuation, it still couldn’t build enough robots to replace ANY human for MANY years. These are to fill in the gaps for when you can’t get enough workers to keep up with demand.</p>
<p>Anyway, back to the lab. Along each side I saw a row of robot prototypes for what Giant AI is calling “the Universal Worker.” Each was being tended to by staff, and as Adrian gave me the tour he explained what each was doing. They see using a new form of ML built on neural radiance fields – the engineers are putting the finishing touches on blog posts that will soon go deep into the technology. In the video Kaehler goes into some depth about what it’s doing and how it works.<br><br>Each robot had a humanoid form. Even a smile on the face. And the head moved in a way that I had never seen before. Strangely human like. Which, Adrian says in the video embedded here, is part of its ability to learn quickly and also get the trust of the human working alongside it. It also lets it do a wider range of jobs than otherwise. It sees the machine, or task, it is standing in front of like we do – in 3D. And, in fact, there are many other similarities between what runs under robots, virtual beings, autonomous vehicles, augmented reality headsets and glasses. Kaehler is the only human I know who has built three of those, and he says that they are all strongly connected underneath in how they perceive the world and let others see that perceived and synthesized world.<br><br>As you get a look around his lab, you do see that they feel like early versions of the Tesla Autopilot system: a little rough and slow. Heck, even today, four years later, it does 6,000 more things, but it still seems a little rough and slow. The Universal Workers feel a bit the same to me. At least at first. Until I started seeing that this was real AI learning how to grasp and place things. It felt humanlike as it placed a rod onto a machine yet another time in a row without dropping it. </p>
<p>I remember talking to the manager of the Seagate Hard Drive factory in Wuxi, China, about why he hired so many women. Nearly his entire final assembly line was women, highly trained too; I watched several run a scanning electron microscope. I never will forget what he told me: “Men drop drive off line, pick it up, put it back on line. Women don’t do that. They bring over to me and admit fault.”</p>
<p>This robot was learning quickly how to recover from its mistakes. Which is how it was designed, Adrian told me. It has grids of sensors in each finger, which can do a new kind of “feeling” that I’d never seen a robot use before. Each of those sensors was being pushed and pulled by a cord going to a machine in the belly of its humanoid form. On the end of an arm that was built from cheap consumer processes. The hand shakes, just slightly, especially if a big forklift goes by. </p>
<p>Giant’s AI is what makes it possible for the robot to be far less expensive. It “sees” the world in a new way, using something the AI engineers call “Neural Radiance Fields.” A new form of 3D scenes that you can walk through. In Giant AI’s case it moves the hands through these radiance fields, which are unlike any 3D data structure we’ve ever seen before.<br><br>Its AI is constantly adapting and learning, which lets it figure out how to recover from a mistake very quickly. Adrian wrote the math formula on the board on a previous trip. It keeps pushing the hands toward the best possible outcome. So, you can slap them and they’ll recover. Or, if an earthquake hits, the machine shakes, and the robot drops your motor before it goes into the box it was supposed to go in, it still should be able to complete the task, just like a human would, or try to save the part and, if possible, report a problem. </p>
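<p>For readers curious what a “radiance field” actually is, here is a minimal sketch of the idea from the published NeRF research: a learned function maps a 3D point (and, in the full method, a viewing direction) to a color and a density, and an image is rendered by sampling that function along each camera ray and blending the samples. The <code>toy_field</code> and <code>render_ray</code> functions below are simplified stand-ins for illustration, not Giant AI’s code.</p>
<pre><code># A toy radiance-field renderer in Python; illustrative only, not Giant AI's system.
import numpy as np

def toy_field(point):
    """Stand-in for the learned network: a fuzzy ball of density at the origin."""
    density = np.exp(-4.0 * np.dot(point, point))  # denser near the origin
    color = np.array([1.0, 0.5, 0.2])              # constant orange color
    return color, density

def render_ray(field, origin, direction, near=0.0, far=4.0, num_samples=64):
    """Volume-render one ray: blend sampled colors, weighted by how much
    density the ray has already passed through (alpha compositing)."""
    ts = np.linspace(near, far, num_samples)
    step = ts[1] - ts[0]
    transmittance = 1.0          # fraction of light not yet absorbed
    pixel = np.zeros(3)
    for t in ts:
        color, density = field(origin + t * direction)
        alpha = 1.0 - np.exp(-density * step)
        pixel += transmittance * alpha * color
        transmittance *= 1.0 - alpha
    return pixel                 # RGB color for this ray's pixel

# One ray looking at the ball from z = -3 toward +z.
print(render_ray(toy_field, np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0])))
</code></pre>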
<p>Anyway, at this point, you are wondering “why did you hype up Tesla’s robot so much?” Last week I did. Those who are inside the Tesla factory tell me that their simulator gives them an unfair advantage and will let them build a humanoid robot that can walk around and do a variety of tasks much faster than people are expecting. You’ll see Tesla’s robot in September as part of its AI day announcements. Yes, hardware is hard, but if you have the best simulators it is getting easier.</p>
<p>In a way this is a David vs. Goliath kind of situation. So Giant had to focus on a very specific, but large enough, problem: one of low-skilled workers and what they need help with.<br><br>Which is why Giant’s Universal Worker doesn’t have legs. Giant isn’t a trillion dollar company. It can’t afford to put legs on a robot that doesn’t need them. A worker in a factory always stays in the same place and does the same job over and over and over.<br><br>It doesn’t spy on you the way that the Tesla robot will (Giant’s AI can only “look at” the work surface in front of it). It can’t walk around your factory floor mapping it out, or watching workers in other parts of your plant as it walks around.<br><br>It also doesn’t have a completed mouth, or a voice response system, or the ability to really communicate well with other human beings the way the Tesla robot will need to do. Which makes the Giant robot far cheaper than the Tesla ones, and it is ready now, at a speed slower than a human’s or, soon, at the same speed.</p>
<p>That said, Kaehler is keeping up to date on the latest computer vision research and he knows that Tesla’s will do many things Giant’s can’t, and that’s fine with him. He doesn’t have a car company to gather data about humans in the real world. It isn’t his goal to build a robot that can deliver pizza. Just do boring jobs that humans need an extra set of hands to help do.<br><br>Giant AI already has orders, Adrian says, but the company needs funding to get to the place where it can manufacture the robots itself.<br><br>I remember visiting “Mr. China” Liam Casey. I visited him in his Shenzhen home and he gave me a once in six thousand lifetimes tour of Shenzhen that I treasure to this day. Then he took me on an even wilder one over his homeland of Ireland, where he showed me a research lab that Mark Zuckerberg ended up buying.<br><br>What did Casey teach me? He had the same problem. No one would invest in his business, even though he had customers. How did he get his orders done? I asked him. “I got them built.”<br><br>“But how? Did you have something to trade? A house, an expensive car, secret photos, what?”<br><br>“My US Passport.”<br><br>The factory owner demanded his passport in trade for building his order. A form of collateral I’d never heard of before. Then the owner had Casey travel the country to all his factories to do a certification on each. That led Casey to see the power of databases, particularly ones for tracking supply chains. Which is why he is Mr. China today, and makes many products in his PCH company that probably are in your home today. He used that early research about China’s factories to become the supply chain leader that many technology companies use to build their products.<br><br>Giant needs the same today. A way to get the product finished and manufactured. Capital, and lots of it, to get to where these robots are working hard to make everyone’s lives better.</p>
<p>Tesla’s simulator has ingested a lot more than just where the car has gone. It knows EVERYTHING about its owners. So, when an engineer wants to recreate a street, it is amazingly real and the people in it will even stop to say hello or let you check out their dog. Then you can make it rain. Or make it sunny. Over and over and over again. </p>
<p>Why is it so magical? BECAUSE OF the data the car and phone collect. A Tesla crosses the Golden Gate Bridge every 10 seconds. No one else is even in the same universe in data collection capabilities.<br><br>Tesla has a bleeding-edge AI similar to Giant’s, but Tesla’s has billions of times more data than Giant will ever get its hands on.<br><br>However, if you just need a machine to push a button or two every time it notices a job is done, do you really need Tesla’s AI and simulator, which will have to do a whole lot more? No, at least not now, because the costs will be far higher for the Tesla robot, which will need to walk and get in and out of autonomous vehicles.<br></p>
<p>That said, now that I’ve seen Giant’s AI and how sophisticated it is with literally no data compared to the Tesla system, I realize that the Tesla one must be far more advanced, so I started asking around.<br><br>The Tesla robot will need to get out of an autonomous vehicle and figure out how to get a pizza up to your apartment, or to your front door. That becomes obvious once you talk to as many people as I do. <br><br>Kaehler showed me how Giant’s AI would do that if it had access to Tesla’s data and resources, particularly its simulator, which rafts of people can “jump into” and walk around in, training it over and over until it gets things right. The demos you see in the video, as impressive as they are, are quaint compared to what the resources of Tesla can generate.<br><br>Every day I’m more and more convinced my predictions are conservative. Either way, getting the first look at Giant’s Universal Worker gives us a good look at the future of work, so I hope you appreciate being first to see this. I sure did. </p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
<item>
<title>When will augmented reality glasses be ready for consumers?</title>
<link>https://scobleizerblog.wordpress.com/2022/05/31/when-will-augmented-reality-glasses-be-ready-for-consumers-really/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Tue, 31 May 2022 22:18:48 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9525</guid>
<description><![CDATA[Unfortunately it has taken a lot longer to get augmented reality glasses to consumers than I expected. A lot longer. I thought we would be wearing them all day by now. Heck, when I got Google Glass years ago I thought I never would take those off. Boy, was I wrong. Many in Silicon Valley … <a href="https://scobleizerblog.wordpress.com/2022/05/31/when-will-augmented-reality-glasses-be-ready-for-consumers-really/" class="more-link">Continue reading <span class="screen-reader-text">When will augmented reality glasses be ready for consumers?</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<p>Unfortunately it has taken a lot longer to get augmented reality glasses to consumers than I expected. A lot longer. I thought we would be wearing them all day by now. Heck, when I got Google Glass years ago I thought I never would take those off. </p>
<p>Boy, was I wrong. <br><br>Many in Silicon Valley taunt me for my previous optimism, saying “not this year, not next year.” <br><br>That doesn’t mean they aren’t getting closer. For the past seven years I’ve been watching <a href="https://lumusvision.com/">Lumus</a>, a small company in Israel that makes the best lenses/displays I’ve seen so far. Every couple of years they come and visit me and show me their latest. Every year the displays get brighter, lighter, more efficient, smaller, and more. <br><br>Here, on video, is Lumus’ head of marketing showing me its latest displays, and you can see just how big an improvement the company has made. You can see these are getting much closer to the size and quality that consumers will be happy wearing.</p>
<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="embed-youtube"><iframe title="The future of augmented reality glasses with Lumus" width="1100" height="619" src="https://www.youtube.com/embed/wLN5QNJj_aY?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>
<p>But since I have been so wrong before, I wanted to take a more sober look at these displays and ask myself “when will consumers buy these?”<br><br>That may just be the wrong question. Unless I was working at Meta or Apple or Snap. </p>
<p>Enterprise uses of these are coming right now. Just look at the revolution in robotics that is underway. AI pioneer <a href="https://twitter.com/adrian_kaehler">Adrian Kaehler has been retweeting every amazing robot on Twitter</a> (he is CEO of <a href="https://www.giant.ai/">Giant AI</a>, which makes manufacturing robots that are coming over the next year) and there are dozens that work on all sorts of production lines, not to mention do a variety of other jobs. These glasses would be perfect for controlling, and training, all these new robots, and for a variety of other things, from training workers to assisting in surgery. This is why Magic Leap, which I had written off due to its cord and lack of consumer experiences, has a new shot at life that I also didn’t see coming.<br><br>Other augmented reality companies have pivoted away from consumers and toward enterprise uses of these glasses and devices (most notably Magic Leap and Microsoft’s HoloLens). </p>
<p>Why? </p>
<p>Well, for instance, look at some of the limitations of even these amazing new displays from Lumus. While they are many times brighter than, say, the Magic Leap or HoloLens displays, and have bigger fields of view, the image does not quite match my 4K TV, which cost me $8,000 last year. </p>
<p>So, consumers who want to watch TV, or particularly movies, in their glasses will find the image quality still not as nice as a bleeding-edge TV, tablet, or phone display (although inside they are damn close), even though augmented reality glasses offer many other advantages (like letting you watch on a plane or while walking around, something my big TV can’t do). But these are dramatically better than they were the last time I saw Lumus’ latest. White whites. Sharp text. Bright videos and images.<br><br>The field of view, too, is 50 degrees. OK, that does match my 83-inch TV when I am sitting on my couch (the image in the Lumus is actually slightly bigger than my TV) but that isn’t enough to “immerse” you the way VR does. Will that matter to consumers? I think it will, but 50 degrees is way better than what Snap is showing in its current Spectacles. In 2024’s devices screens will be virtualized, too, so the hard field-of-view numbers won’t matter nearly as much. These are certainly better than my HoloLens’s field of view.<br><br>Also, bleeding-edge TVs, like my Sony OLED, have better color and luminance depth. What does that mean? TV and movies still look better on my TV. But that, also, is sort of a bad comparison. My TV can’t travel with me. These displays are pretty damn good for a variety of uses; I just wish I didn’t need to wait until 2024 to get them.</p>
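<p>To put that 50 degrees in perspective, here’s the quick math: the horizontal angle a screen fills is twice the arctangent of half its width divided by the viewing distance. The couch distances below are just example numbers I picked to show how fast the angle changes.</p>
<pre class="wp-block-code"><code># Rough check: how much field of view does an 83-inch 16:9 TV fill from a couch?
# The viewing distances are example assumptions, not measurements.
import math

def tv_horizontal_fov(diagonal_inches, distance_inches, aspect=16 / 9):
    width = diagonal_inches * aspect / math.sqrt(aspect ** 2 + 1)
    return math.degrees(2 * math.atan((width / 2) / distance_inches))

for feet in (6, 7, 8, 10):
    fov = tv_horizontal_fov(83, feet * 12)
    print(f"83-inch TV at {feet} ft: about {fov:.0f} degrees horizontal")
</code></pre>
<p>From six or seven feet away an 83-inch TV works out to roughly 47 to 53 degrees, so a 50-degree display really is in the same ballpark; sit farther back and the TV shrinks well below it.</p>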
<p>This is why many who are working on Apple’s first device tell me it is NOT doing see-through glasses like these for its first product. They just don’t match consumer expectations yet (although these Lumus lenses are a lot closer than any others I’ve seen so far). </p>
<p>Apple’s first device is what those of us in the industry call a “passthrough” device and is NOT a pair of glasses like what Lumus is showing here. In other words, you can’t see the real world through the front of the device. Unless the device is on (Apple’s device will present a digital recreation of the room you are in — I hear its new version of augmented reality is pretty mind blowing, too). </p>
<p>Until this next generation of devices happens these glasses will mostly be used for R&D or enterprise uses, like controlling robots or production lines, or doing things like surgery, where field of view, brightness, etc. aren’t as important as they will be for consumers. Lumus is selling its much-improved lenses to consumer-focused partners, but it doesn’t expect the really interesting glasses until 2024.</p>
<p>I’ve been working with a variety of enterprise users and there is a deep hunger here for better glasses. At Trimble, a construction technology company, for instance, they are working on a variety of initiatives. They are using Boston Dynamics’ robots to map out construction sites in 3D and then using HoloLenses to do a variety of tasks. The problem? The HoloLens only has displays that are about 400 nits, which in practice means dim, poor-quality color with very little readability in bright sunlight. Lumus’ displays are 5,000 nits. Yesterday I took them outside and saw that they are plenty bright enough for bright environments. </p>
<p>Also, the HoloLens is very heavy and big compared to the glasses that Lumus and many others are readying. The construction workers are not happy with the size of the HoloLens, or even the Magic Leap, which has a cord down to a computer that clips on your belt. </p>
<p>These enterprise users are hungry to buy a decent set of augmented reality glasses. Lumus should help its partners get to these markets long before Meta, Snap, or Apple figure out how to get consumers to want to buy glasses. <br><br>How will I evaluate whether the market is ready?<br><br>Let’s make a list.<br><br>1. <strong>Brightness. </strong>2,500 nits is perfect for most enterprise uses (HoloLens is only 400 and all my clients complain about lack of brightness and visual quality). Lumus says theirs can do 5,000, which gets close to consumer expectations. Big improvements over the past and over competitors I’ve seen. </p>
<p>2. <strong>Color. </strong>The Lumus lenses are much better than others I’ve seen. Pure whites and decent color (I could watch TV and movies in them). Enterprise is ready. Will consumers take to these in 2024? I think so. No color fringing like I see on my HoloLens. Much much nicer.</p>
<p>3. <strong>Size.</strong> The projectors in the Lumus are much smaller than they were three years ago when I last saw Lumus’ work. Very awesome for doctors, construction workers, production line workers, etc but still a bit too big for “RayBan” style glasses. But I could see wearing these for hours.</p>
<p>4. <strong>Cost. </strong>They avoided this question, sort of, but the cost is now coming down to enable devices that are $2,000 or less. That is acceptable for many enterprise uses, but still too high for most consumers. That said, I’m wearing glasses that cost $1,500 before insurance, so we are heading to consumer pricing.</p>
<p>5. <strong>Battery life and heat generation.</strong> Here Lumus has made big strides. They claim devices that are running their latest projectors will be able to go for hours, even all day, depending on how often the displays are showing information. That is great for, say, a surgeon, using a system like the one MediView makes. They only need displays on for a few minutes during surgery. Same for many other enterprise uses. Most workers won’t be trying to watch live streaming video all day long, like consumers will be. Also, they don’t heat up like others on the market do. But for consumer uses? Not quite there yet. Consumers will want to watch, say, CNBC all day long, along with working on screens of information. </p>
<p>6. <strong>Field of view. </strong>Yes, it’s better than my expensive 83-inch TV, but not by much. Consumers will have higher expectations than just 50 degrees. Enterprise users? Don’t care much at all. The benefits of having screens on their eyes outweigh the lack of wrap-around screens. </p>
<p>7. <strong>Content. </strong>Consumers will want to do everything from edit spreadsheets to watch TV shows and movies and play video games. All of which Lumus will never do, so its partners will need to come up with all of that. Enterprise users are far more focused on very specific use cases, like controlling a robot, or being able to see data on production machinery. That’s a hard job, for sure, but a far easier one than getting the wider range of things consumers expect done. Yes, the Metas, Apples, Googles, Snaps, Niantics, etc, are working on all that but they aren’t nearly ready with enough to get consumers to say “wow.” </p>
<p>8. <strong>Resilience. </strong>Consumers will want to wear these devices out in the rain. Will drop them. Their kids will step on them. How do I know? All that has happened to my glasses, which I’m forced to wear simply to see. Enterprise users are more focused on safety and many jobs, like surgery, will not need nearly the same kind of resilience that consumers will need. </p>
<p>Now, can all these problems be fixed by, say, an Apple or a Meta or a Snap? Sure, but I bet on Apple being more aggressive and that didn’t happen. So, we need to wait for next year’s launch of a bigger, heavier device aimed at home users to see how well consumers react to augmented reality devices on our faces.</p>
<p>Now, is there someone out there that has glasses ready to go sooner? Maybe. But even if, say, NVIDIA has a pair that does a lot, will it have all the advantages of Apple? No way. Not for a while. </p>
<p>This is why Mark Zuckerberg told investors that it will be “years” before augmented reality devices make big money with consumers. Even Meta’s VR efforts, after being out for seven years, with a ton of content and a low price of $300, are only selling about 1.5 million units a quarter (Apple sells that many phones in about two days). </p>
<p>Translation: as excited as I am about going to this week’s Augmented World Expo, we still have a couple of years to go, at minimum. I’m bummed writing that, but it’s better to be more realistic about the near future than optimistic. </p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
<item>
<title>As blind honor Apple accessibility pioneer my son shows far more work is ahead</title>
<link>https://scobleizerblog.wordpress.com/2022/03/28/as-blind-honor-apple-accessibility-pioneer-my-son-shows-far-more-work-is-ahead/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Mon, 28 Mar 2022 18:38:41 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9515</guid>
<description><![CDATA[It didn’t shock me that MojoVision (a Silicon Valley startup making augmented reality contact lenses) brought a big percentage of their team and had a table right in the middle of Sight Tech’s event honoring Mike Shebanek for his work on Apple’s VoiceOver functionality that enables blind people and those who have vision impairments use … <a href="https://scobleizerblog.wordpress.com/2022/03/28/as-blind-honor-apple-accessibility-pioneer-my-son-shows-far-more-work-is-ahead/" class="more-link">Continue reading <span class="screen-reader-text">As blind honor Apple accessibility pioneer my son shows far more work is ahead</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="embed-youtube"><iframe title="Mike Shebanek honored by Vista (he built accessibility features at Apple)" width="1100" height="619" src="https://www.youtube.com/embed/79VJeUdkKZY?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>
<p>It didn’t shock me that MojoVision (a Silicon Valley startup making augmented reality contact lenses) brought a big percentage of its team and had a table right in the middle of Sight Tech’s event honoring Mike Shebanek for his work on Apple’s VoiceOver functionality, which enables blind people and those who have vision impairments to use iPhones. All around the audience were blind people. <br><br>MojoVision’s CEO, Drew Perkins, had cataracts and eye surgery, and has long sought to build a bionic eye. So, it makes sense MojoVision would align itself with the blind community. But all around were others working on augmented reality products: Meta, Apple, and others. </p>
<p>While Shebanek’s speech will be interesting to any Apple fan (he tells lots of stories about building an accessibility team at Apple, including plenty of Steve Jobs stories), I don’t want you to miss what happens about 57 minutes into my video: <a href="https://youtu.be/79VJeUdkKZY?t=3431">several of the blind people around the room were called on to tell what Apple’s VoiceOver meant to them</a>. </p>
<p>The stories are heartwarming but the job isn’t done. Why do I say that? My 14-year-old son is a special needs kid; he is autistic and his speech is hard for many to understand. None of the AI voice systems understand him, and you should hear his frustration at not being able to communicate with computers the way his brother can by talking to Alexa or Siri. He’s had Apple devices since he was two years old.<br><br>He can’t use systems like Apple’s Siri, Amazon’s Alexa, or even Google’s Assistant with his voice. They just don’t understand him. <br><br>As we move into augmented reality devices, which could greatly help him, and those like him, live their lives, these technology walls grow more daunting. Why? Five years from now we will be talking to AIs far more frequently than today. <br><br>At his public school his special needs classmates have similar problems. Some can’t hold their hands still enough to type on a keyboard. Many have a tough time with speech. </p>
<p>Will my son and his fellow students be included in the next paradigm shift? The paradigm shift to 3D computing and new user interfaces that you control with your real voice and real hands. For some users, like my son, this will be a frustrating paradigm shift.<br><br>It was an honor hearing Mike Shebanek’s stories. He’s a real pioneer who has left a deep mark on many companies (he now works at Meta). He gives me hope that my son, and his fellow students, will be included in the computing platform of the future.<br><br>Thanks to the Vista Center for inviting me. </p>
<p>The Vista Center empowers individuals who are blind or visually impaired to embrace life to the fullest through evaluation, counseling, education, and training. Learn more: <a rel="noreferrer noopener" href="https://www.youtube.com/redirect?event=video_description&redir_token=QUFFLUhqbHdpcENUeVkwbmJLS2MxTVNhT2pnZndJTDJEd3xBQ3Jtc0tuSklvc0JidWNFaU5pNzFMbjM1N2ZTWi01eGE5c3pDRjhSdHU3S0JZbGZLelZhVktuLVBaYy1PVElfMFRVWHJCb1RhWnRQYzM0REhxZDJGV1phYVhTTVBHcnpMUjZVVGRXWVVMMVAwWENqRW1ma0ZESQ&q=https%3A%2F%2Fvistacenter.org" target="_blank">https://vistacenter.org</a> </p>
<p>It has a conference coming in December, 2022, for developers who are shaping new technologies to create a more accessible world for people who are blind. Details on that here: <a rel="noreferrer noopener" href="https://www.youtube.com/redirect?event=video_description&redir_token=QUFFLUhqbFhjS05iRVR0YnJmOFl6WmdiN0Qtb0NmNUhMQXxBQ3Jtc0trUGtFMTU1aVVZRXl1UDZ4d2I5YjVXZ014Q0tvTXNXUUdmT3pyVGRwRFNLYkZaTFU2dnpVbUdiOXZZdHRGWDdjVkFTdDY0elp5RW1OWElLbjljZmI5LUs0SXUtV3M5ejRDaWxXRXVlYmIyR3UtdG9IVQ&q=https%3A%2F%2Fsighttechglobal.com" target="_blank">https://sighttechglobal.com</a></p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
<item>
<title>Future proof your playlists with these HUGE Dolby Atmos music lists</title>
<link>https://scobleizerblog.wordpress.com/2022/02/24/future-proof-your-playlists-with-this-huge-dolby-atmos-music-list/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Thu, 24 Feb 2022 20:29:52 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9455</guid>
<description><![CDATA[UPDATE August 27, 2022. I’ve been focusing only on Apple lately. Amazon’s user experience really sucks for people trying to curate massive playlists and it’s just too difficult to keep all my lists synched. My Apple lists are here: https://music.apple.com/profile/AllDolbyAtmos As Apple puts the finishing touches on its new augmented reality headset, expected later this … <a href="https://scobleizerblog.wordpress.com/2022/02/24/future-proof-your-playlists-with-this-huge-dolby-atmos-music-list/" class="more-link">Continue reading <span class="screen-reader-text">Future proof your playlists with these HUGE Dolby Atmos music lists</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<p><em>UPDATE August 27, 2022. I’ve been focusing only on Apple lately. Amazon’s user experience really sucks for people trying to curate massive playlists and it’s just too difficult to keep all my lists synched. My Apple lists are here: <a href="https://music.apple.com/profile/AllDolbyAtmos">https://music.apple.com/profile/AllDolbyAtmos</a> </em></p>
<p>As Apple puts the finishing touches on its new augmented reality headset, expected later this year, I’ve been tracking innovation in music: Spatial Audio/Dolby Atmos. Why? Dolby Atmos will be a huge part of the announcements Apple is going to make. It will also be very important in the future of the “metaverse.”<br><br>Last year we got a new Sonos system that plays Dolby Atmos (new spatial audio/surround sound/better quality). Having sold audio gear in the 1980s, I find it amazing that you can now feel like you are in the middle of a concert. Apple’s headphones, which I also have, support Dolby Atmos too, but don’t really get you the surround sound or the bass of our $3,800 Sonos system.<br><br>While watching group forums on Facebook and elsewhere I see lots of others getting new audio systems that play Dolby Atmos. Movies have played in Atmos for years, but music services started offering Atmos less than a year ago. </p>
<p>The problem is finding Dolby Atmos music. </p>
<p>For instance, <a href="https://music.apple.com/us/playlist/rock-in-spatial-audio/pl.a82d7ac0ee854d8b8a95708c76210023">Apple’s “Rock Spatial Audio” list has 99 songs</a>. Nice start, but I got bored very quickly. So I started collecting my own. My rock list has 1,131 songs and my hard rock list has 166 songs. Finding these is very difficult. Why? Some albums only have one song done in Dolby Atmos. So you gotta go through each song one by one, and you need to know where to look to find new ones. </p>
<p>None of the services are doing Dolby Atmos fans, like me, justice. I’m on all of them that support Atmos (Tidal, Amazon, and Apple) and even some that don’t support Atmos (like Spotify and YouTube Music). </p>
<p>It makes you wonder why the music industry is hiding its biggest technology advance in decades. When it comes to Apple, I’m pretty sure it is readying its own Dolby Atmos music service for its new headset. But Amazon? Its UI is horrid. Worse, all the services have really shitty search engines. </p>
<p>Anyway,<a href="http://scripting.com"> Dave Winer</a> regularly writes that blogs let authors route around big companies. This is exactly what is going on here. Now, I know 99.99% of people don’t care. That’s fine. You will when you get new surround sound headphones next year. If you are still reading, just remember that this post exists so when you do start to care about music quality you have a resource to go to. <br><br>Anyway, here’s the master list of my playlists. I’m breaking them into two sections: “curated” and “catalog.” Curated means I built the list after listening to every song. I built these for my own home and they are what I listen to every day. Catalog means it’s just a list of everything I can find (like my rock lists) without any concern about the quality. </p>
<p>If you use these, make sure you see the Dolby Atmos logo. If you don’t see the logo when playing, then you aren’t getting the full Atmos experience (you might need to turn it on in your phone’s settings, or upgrade your equipment).<br><br>So, let’s start with “curated.” The first link is to Apple Music. Amazon has a lot less music in Dolby Atmos format and I have only moved over some of my playlists (they take hours to move over because Amazon has far less Atmos). </p>
<p>I include the link here because Amazon sounds better than Apple. Even on Apple’s own headphones. Why? Because it is using a new version of Dolby Atmos that Apple and Tidal aren’t yet using.<br><br>1. <a href="https://music.apple.com/us/playlist/chill-together/pl.u-pMBMc4B5JBe">Chill Together</a>. 242 songs. This is music that Maryam (I’m her husband) and I like listening to together. Nice and calm music. </p>
<p>2. <a href="https://music.apple.com/us/playlist/dolby-atmos-nightclub/pl.u-pMrmu4B5JBe">Dolby Atmos Nightclub</a>. 451 songs. The opposite of Chill Together. Lots of explicit language and mostly hip hop/rap. Loud, obnoxious. Rattles the subwoofers. (<a href="https://music.amazon.com/user-playlists/31a5bcd09d804becb3c466baa47750c3sune?ref=dm_sh_509d-9b27-90ef-3458-7ce1c">Amazon</a>)</p>
<p>3. <a href="https://music.apple.com/us/playlist/dolby-atmos-party/pl.u-r2GktP2eX2z">Dolby Atmos Party</a>. 373 songs. None of the explicitness of the nightclub, but still fun beats to get people dancing. (<a href="https://music.amazon.com/user-playlists/acd837ee3d714c68801db1937bf342c5sune?ref=dm_sh_7e9a-ec19-a838-db59-24aed">Amazon</a>)</p>
<p>4. <a href="https://music.apple.com/us/playlist/dolby-atmos-radio/pl.u-4JW2Fa4XL4l">Dolby Atmos Radio</a>. 1,674 songs. Music that’s great to listen to all day long. No explicit stuff, but a wide variety of songs. (<a href="https://music.amazon.com/user-playlists/6e0756c771d141d3a19800b69cd5a506sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_yNHLgVpuzSUNRezfN2EkXgzrw">Amazon</a>)</p>
<p>5. <a href="https://music.apple.com/us/playlist/dolby-atmos-speaker-demonstrations/pl.u-0JpAIWq7gqm">Dolby Atmos Speaker Demonstrations</a>. 91 songs. The best of the best. I did this list to show family and friends what Dolby Atmos is all about but I found it’s great to keep going back to whenever the software in my speaker system upgrades. (<a href="https://music.amazon.com/user-playlists/e202c2c340fa4087abe899b6724959e2sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_hg2rgnkgDVKfYkelkUIHqUSkN">Amazon</a>)</p>
<p>6. <a href="https://music.apple.com/us/playlist/favorites/pl.u-yZ7mFY0ZK0o">Favorites</a>. 1,254 songs. Similar to Dolby Atmos Radio but with a little higher quality level. </p>
<p>7. <a href="https://music.apple.com/us/playlist/holiday-party/pl.u-55Y6s8NX5NL">Holiday Party</a>. 102 songs. My favorite Christmas/holiday music. (<a href="https://music.amazon.com/user-playlists/3900258395d74d26a92e76ef67987bffsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_ONpTzU2IagZEs1qllB2O6nM7b">Amazon</a>)</p>
<p>8. <a href="https://music.apple.com/us/playlist/quiet-beauty/pl.u-zPZLtZ6806L">Quiet Beauty</a>. 165 songs. Very quiet instrumental music. Great for having in the background while working or reading. (<a href="https://music.amazon.com/user-playlists/b1ff29279a904dc8877028585a3aed65sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_Yplm3EYIPjblA6zJY8P4qpSk5">Amazon</a>)</p>
<p>9. <a href="https://music.apple.com/us/playlist/timeless/pl.u-EdW3Ia2qY2A">Timeless</a>. 398 songs. The music that we can listen to for decades and not get tired of listening to. (<a href="https://music.amazon.com/user-playlists/33915a5978824fde9a8884ea6ff42f91sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_uEtsE1qv7iLsJGHgakvt9uC8A">Amazon</a>)</p>
<p>10. <a href="https://music.apple.com/us/playlist/vibe-alignment/pl.u-55G2s8NX5NL">Vibe Alignment</a>. 432 songs. Nice songs to listen to with other people in the room. (<a href="https://music.amazon.com/user-playlists/b675198c7e8e4390ad2ee7a53f47e846sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_0JODmJH6j0ynYoR1JiqimpkaX">Amazon</a>)</p>
<p>The rest is what I call “catalog.” In other words, genres or other things that don’t have editorial input from me. Here I go for completeness, not quality. Usually I try to stay with Apple’s own categorization.</p>
<p>11. <a href="https://music.apple.com/us/playlist/african/pl.u-E2VCa2qY2A">African</a>. 158 songs. </p>
<p>12. <a href="https://music.apple.com/us/playlist/alternative/pl.u-Erqta2qY2A">Alternative</a>. 3,132 songs. (<a href="https://music.amazon.com/user-playlists/e8ea6e2a247a4a498e6efa820c837afbsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_i5c2I9NdSaBOkmYG9bZjRM4Uu">Amazon</a>)</p>
<p>13. <a href="https://music.apple.com/us/playlist/blues/pl.u-EdgGsa2qY2A">Blues</a>. 61 songs.</p>
<p>14. <a href="https://music.apple.com/us/playlist/bollywood/pl.u-MDR3CWx3Mxd">Bollywood</a>. 280 songs.</p>
<p>15. <a href="https://music.apple.com/us/playlist/catalog-a-100-dolby-atmos/pl.u-zpevTZ6806L">Catalog A</a>. (everything I can find that has last name of “A”). 1,417 songs. (<a href="https://music.amazon.com/user-playlists/aaf04c7674c24d858acd9a7ca6375099sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_pVvqEP8RFrNX9i7wz2x7GJjT2">Amazon</a>)</p>
<p>16. <a href="https://music.apple.com/us/playlist/catalog-a-m-explicit-100-dolby-atmos/pl.u-47v3Ha4XL4l">Catalog A-M</a>. (Explicit). 2,935 songs. (<a href="https://music.amazon.com/user-playlists/498183f40ac34b3a883089daacc6e72asune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_OwfjkFmnkb1eWsloLjg5QpSJb">Amazon</a>)</p>
<p>17. <a href="https://music.apple.com/us/playlist/catalog-b-c-100-dolby-atmos/pl.u-y2lxsY0ZK0o">Catalog B-C</a>. 4,316 songs. (<a href="https://music.amazon.com/user-playlists/b838739aeb47403e9c541f76ac72db82sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_Y5XzIkddlgyiJb5AnypDT6oAw">Amazon</a>)</p>
<p>18. <a href="https://music.apple.com/us/playlist/catalog-d-f-100-dolby-atmos/pl.u-Xz9ZFDmgPmJ">Catalog D-F</a>. 2,318 songs. (<a href="https://music.amazon.com/user-playlists/11b2553035a14b9089b4991894cc2b3dsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_mm0m5J5v9J9h8wqgjQQxezSXW">Amazon</a>)</p>
<p>19. <a href="https://music.apple.com/us/playlist/catalog-g-j-100-dolby-atmos/pl.u-XzyJuDmgPmJ">Catalog G-J</a>. 3,715 songs. (<a href="https://music.amazon.com/user-playlists/e4a59d1bf8074917bbcdd218607bff3asune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_YCvanbkto0iQUraVG7SVRosts">Amazon</a>)</p>
<p>20. <a href="https://music.apple.com/us/playlist/catalog-k-100-dolby-atmos/pl.u-qp3Mu2jdmjM">Catalog K</a>. 886 songs. (<a href="https://music.amazon.com/user-playlists/ea9da1fc1b364e47b0a21038b7ca3a4fsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_9DueD3CjBdc6yES0jhBa889CU">Amazon</a>)</p>
<p>21. <a href="https://music.apple.com/us/playlist/catalog-l-m-100-dolby-atmos/pl.u-y293TY0ZK0o">Catalog L-M</a>. 4,042 songs. (<a href="https://music.amazon.com/user-playlists/eee8ca6a12c8400da601b591afab6fabsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_fOPiJD5aFnvzstmR32UElxNLD">Amazon</a>)</p>
<p>22. <a href="https://music.apple.com/us/playlist/catalog-n-p-100-dolby-atmos/pl.u-y2KXsY0ZK0o">Catalog N-P</a>. 2,229 songs. (<a href="https://music.amazon.com/user-playlists/e595d226465e4940aba18b1994b08f01sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_aRTBdQUqnTj9WNSNJD3DFTHDo">Amazon</a>)</p>
<p>23. <a href="https://music.apple.com/us/playlist/catalog-n-z-explicit-100-dolby-atmos/pl.u-qp16t2jdmjM">Catalog N-Z Explicit</a>. 1,870 songs. (<a href="https://music.amazon.com/user-playlists/a5244338cd63413c9589672d066fb8d0sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_HA8Q5i3l1DLbZ4XPl5MuyH01M">Amazon</a>)</p>
<p>24. <a href="https://music.apple.com/us/playlist/catalog-q-s-100-dolby-atmos/pl.u-zpd5tZ6806L">Catalog Q-S</a>. 3,050 songs. (<a href="https://music.amazon.com/user-playlists/599762d3bbf94d0084a82d3e1bf09659sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_05UnCsHPF7GsBXFftzGnpuQWx">Amazon</a>)</p>
<p>25. <a href="https://music.apple.com/us/playlist/catalog-t-v-100-dolby-atmos/pl.u-qpd5T2jdmjM">Catalog T-V</a>. 1,667 songs. (<a href="https://music.amazon.com/user-playlists/714343044f2f4959969c220c2d19134fsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_PgfvJ3jXb0ahQWodLSEtlkWov">Amazon</a>)</p>
<p>26. <a href="https://music.apple.com/us/playlist/catalog-w-z-100-dolby-atmos/pl.u-47VBIa4XL4l">Catalog W-Z</a>. 1,568 songs. (<a href="https://music.amazon.com/user-playlists/6cc3678dc70b4996915abea8203f308dsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_XzuysoItmCzelwpHiE1dla7W0">Amazon</a>)</p>
<p>27. <a href="https://music.apple.com/us/playlist/childrens-music/pl.u-06JguWq7gqm">Children’s Music</a>. 99 songs.</p>
<p>28. <a href="https://music.apple.com/us/playlist/church/pl.u-XkzRiDmgPmJ">Church</a>. 362 songs. Religious.</p>
<p>29. <a href="https://music.apple.com/us/playlist/classical/pl.u-p7aI4B5JBe">Classical</a>. 9,550 songs. (<a href="https://music.amazon.com/user-playlists/cbee139e9a9a4543959e66942cd39500sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_iFZM0VqdReKd97yTGIHWby0bA">Amazon #1</a>. <a href="https://music.amazon.com/user-playlists/0eea386b742c4a02b343d0b224a560a3sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_Hn3XYjexC6zXfITmf32N6cmgO">Amazon #2</a>)<br><br>29B: <a href="https://music.apple.com/us/playlist/classical-crossover/pl.u-r09YTP2eX2z">Classical Crossover</a>. 1,083 songs. (Amazon)</p>
<p>30. <a href="https://music.apple.com/us/playlist/country/pl.u-r25YTP2eX2z">Country</a>. 1,400 songs. (<a href="https://music.amazon.com/user-playlists/f90bca689fc5490796f80240f540f37dsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_78y7MTaYgpYIgqS7fprxGOzdv">Amazon</a>) </p>
<p>31. <a href="https://music.apple.com/us/playlist/dance/pl.u-EdPeua2qY2A">Dance</a>. 856 songs. (<a href="https://music.amazon.com/user-playlists/a9a8e6aede02437b9006b2a667ebbdd1sune?ref=dm_sh_36d1-8c2a-d1d8-d9c1-83897">Amazon</a>)</p>
<p>33. <a href="https://music.apple.com/us/playlist/drinking/pl.u-pMLLC4B5JBe">Drinking</a>. 51 songs.</p>
<p>34. <a href="https://music.apple.com/us/playlist/electronic/pl.u-MDB6tWx3Mxd">Electronic</a>. 711 songs. (<a href="https://music.amazon.com/user-playlists/804667f7094347b9b197fab108968137sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_2k8dVgBemF2lkizAsx9TbmXko">Amazon</a>)</p>
<p>35. <a href="https://music.apple.com/us/playlist/funk-r-b-soul/pl.u-062zFWq7gqm">Funk, R&B, & Soul</a>. 1,516 songs. (<a href="https://music.amazon.com/user-playlists/bbeb5ccfcc5a4c558cedb10798c0ef40sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_47dc5Thac1xJAs3qXMCej545H">Amazon</a>)</p>
<p>36. <a href="https://music.apple.com/us/playlist/hard-rock/pl.u-55mqU8NX5NL">Hard Rock</a>. 280 songs. (<a href="https://music.amazon.com/user-playlists/9d0875a05c8144378a0b357887fbe51dsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_XsrGcj8TE1t9gGRpcVAoPLDy4">Amazon</a>)</p>
<p>37. <a href="https://music.apple.com/us/playlist/hip-hop-rap/pl.u-XkmLiDmgPmJ">Hip-Hop & Rap</a>. 6,259 songs. (<a href="https://music.amazon.com/user-playlists/5f779d5f7b3e4d5da5b741005b2d86acsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_VsYtFE0vw6uh4eYdpbd45m4ab">Amazon</a>)</p>
<p>38. <a href="https://music.apple.com/us/playlist/holiday/pl.u-zPvgTZ6806L">Holiday</a>. 647 songs. </p>
<p>39. <a href="https://music.apple.com/us/playlist/jazz/pl.u-rE2sP2eX2z">Jazz</a>. 581 songs. (<a href="https://music.amazon.com/user-playlists/b8c78afaf07c405e8c0c4420c7431edcsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_nWZM1ieGLM8LOjzWKycGxw9AK">Amazon</a>)</p>
<p>40. <a href="https://music.apple.com/us/playlist/latino/pl.u-qx0Jt2jdmjM">Latino/Mexican</a>. 1,246 songs. (<a href="https://music.amazon.com/user-playlists/11e82fe70001413abddacba25ad4293csune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_QKxQwZS9LAjcMPtiSzpIYKkH3">Amazon</a>)</p>
<p>41. <a href="https://music.apple.com/us/playlist/meditation/pl.u-r2pXFP2eX2z">Meditation</a>. 30 songs.</p>
<p>42. <a href="https://music.apple.com/us/playlist/new-dolby-atmos-friday/pl.u-pMLjT4B5JBe">New Dolby Atmos Friday</a>. (Changes every day as new music comes out. I keep music here for about a week).</p>
<p>43. <a href="https://music.apple.com/us/playlist/pop/pl.u-06vWuWq7gqm">Pop</a>. 4,116 songs. (<a href="https://music.amazon.com/user-playlists/278813b0e6ae469ba7dca9cd478df2besune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_waOunspyqvIiOwjcNUGtMtue0">Amazon</a>)</p>
<p>44. <a href="https://music.apple.com/us/playlist/reggae/pl.u-XkY0cDmgPmJ">Reggae</a>. 59 songs.</p>
<p>45. <a href="https://music.apple.com/us/playlist/rock/pl.u-ELWFa2qY2A">Rock</a>. 1,571 songs. (<a href="https://music.amazon.com/user-playlists/8127db9fb82544b9bc5e27cd86185cffsune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_vSJfwESUG4aN5PdHHmnzsEYN7">Amazon</a>)</p>
<p>46. <a href="https://music.apple.com/us/playlist/singer-songwriter/pl.u-yZPlFY0ZK0o">Singer/Songwriter</a>. 193 songs. (<a href="https://music.amazon.com/user-playlists/72a5468effb54955a84b5ac1c565b797sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_yOSdMXbZQWec6vL6mmvZ1FL6y">Amazon</a>)</p>
<p>47. <a href="https://music.apple.com/us/playlist/soundtrack/pl.u-zP1XIZ6806L">Soundtrack</a>. 786 songs. (<a href="https://music.amazon.com/user-playlists/da7aa3664b664129ba9d7f58fc0f4ad0sune?marketplaceId=ATVPDKIKX0DER&musicTerritory=US&ref=dm_sh_YqwYlJBa9qxjxiM79w6iDdB2u">Amazon</a>)</p>
<p>48. <a href="https://music.apple.com/us/playlist/worldbeat/pl.u-06l6uWq7gqm">World</a>. 322 songs.</p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
<item>
<title>New startup mixes reality with computer vision and sets the stage for an entire industry</title>
<link>https://scobleizerblog.wordpress.com/2022/02/17/invisible-digital-twin-brings-next-stage-of-augmented-reality/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Thu, 17 Feb 2022 17:00:00 +0000</pubDate>
<category><![CDATA[AR]]></category>
<category><![CDATA[Augmented Reality]]></category>
<category><![CDATA[Mixed Reality]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9428</guid>
<description><![CDATA[About 11 years ago I was standing outside in the snow in Munich, Germany with the CTO of a small company, Metaio. He was showing me monsters on the sides of buildings. Apple later bought his company. It got me interested in augmented reality and its uses to make people’s lives more fun and more … <a href="https://scobleizerblog.wordpress.com/2022/02/17/invisible-digital-twin-brings-next-stage-of-augmented-reality/" class="more-link">Continue reading <span class="screen-reader-text">New startup mixes reality with computer vision and sets the stage for an entire industry</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="embed-youtube"><iframe title="Invisible digital twin brings next stage of augmented reality" width="1100" height="619" src="https://www.youtube.com/embed/psPh18yjMOo?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>
<p>About 11 years ago I was standing outside in the snow in Munich, Germany with the CTO of a small company, Metaio. <a href="https://youtu.be/WFgIuOPQovo">He was showing me monsters on the sides of buildings</a>. Apple later bought his company. It got me interested in augmented reality and its uses to make people’s lives more fun and more interesting. The way that first demo happened? The building was turned into an invisible digital twin that the virtual monsters could attach themselves to and move around the building.<br><br>Snapchat has, in the past few years, finally brought that tech to consumers with its augmented reality lenses. Things that can turn your world into complete augmented reality scenes, way better than what Metaio showed me 11 years ago. </p>
<p>While Snapchat’s invisible digital twin lets developers do very cool things, it leaves me wanting. Why? So far the Snap platform doesn’t give us many computer vision capabilities. It doesn’t really let us do the whole range of things we want our augmented reality worlds to do (like keep track of your keys).</p>
<p>Today <a href="https://www.perceptusai.com/">Perceptus</a>, from Singulos Research, gives us an important answer about the future of what humans might do with augmented reality beyond Snap’s filters, which are really designed to make your selfies much more interesting. It does it with a new kind of computer vision that can catalog physical items in your home or factory.</p>
<p>Augmented Reality has so much more potential to change EVERYTHING in our homes and factories, and Perceptus shows us just how we can use digital twins, computer vision, and AI to make our world better than it was before augmented reality arrived.</p>
<p>CEO Brad Quinton buried the lede when we talked yesterday. Journalism term for hiding an important fact until late in the conversation. About an hour in he let drop “you can play virtual chess without even a chess board.” Then he said you don’t need all the physical pieces, either! The invisible digital twin strikes again.</p>
<p>First, why do I have the credibility to make such a statement? 1. <a href="https://www.engadget.com/2010-04-26-voices-that-matter-iphone-how-ben-newhouse-created-yelp-monocle.html">I covered the first augmented reality app on Apple’s store on this blog</a>. 2. I wrote major parts of four books about technology. Qualcomm’s head of augmented reality efforts (Qualcomm makes the inside of Meta’s VR headset), Hugo Swart, tells people <a href="https://read.amazon.com/kp/embed?asin=B086PZMBMW&preview=newtab&linkCode=kpe&ref_=cm_sw_r_kb_dp_M1J9F8SMB9HGEND04163">my latest, “The Infinite Retina,” is a must read</a>. 3. Siri was launched in my son’s bedroom. That was the first AI app. 4. I had the first video of a Google self-driving car on YouTube. <br><br>I am also not paid by, nor connected with, this company, although I would work for it for free if asked. Why? This platform has a major impact on the future of things developers will be able to do in our homes. <br><br>This does NOT require a headset or glasses. If you watch the video you will see founder Brad Quinton demoing it on a simple iPad. But, of course, this really will rock when we get wearable devices that will enable us to use this kind of augmented reality without holding a device in our hands.</p>
<p>“So, Scoble, what is it?”</p>
<p>It lets you do magic. Just watch the demo in the video I posted or watch the more professional video on Perceptus’ website. It shows the power of digital twins and computer vision in our homes. </p>
<p>Soon devices will augment everything with this kind of computer vision and an invisible digital twin. That’s how Snap’s filters work: they add a digital twin of your face and of the city around you into a database that developers can then manipulate. </p>
<p>Digital twins are like a 3D copy of the real world. </p>
<p>In the demo Quinton shows, the table is a digital twin. It looks like the real table, doesn’t it? But it isn’t. In this case the digital twin is mostly invisible. Perceptus uses this digital twin to keep track of things like chess pieces and Lego pieces on the table top (and a few other things too, but I’m trying to keep it simple here). Think of it as a new kind of database: one that is laid out on top of the real world.</p>
<p>The magic here is that Perceptus quickly makes this digital twin and figures out what is on top of it. Computer vision running inside says “hey, there’s a rook in a chess game” and it keeps track of that rook from then on, and developers can then perform their own magic (for instance, a developer might want to change that rook into something crazy, like a SpongeBob character). </p>
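<p>If you want a mental model for that “database laid out on top of the real world,” here is a tiny sketch of what such a registry could look like: detections with positions anchored to the tabletop, merged frame after frame so the same physical piece keeps its identity. The names and structure are hypothetical, just to illustrate the idea; this is not Perceptus’ actual API.</p>
<pre class="wp-block-code"><code># A toy model of a spatially anchored object registry: the kind of
# "database laid on top of the real world" described above.
# All names here are hypothetical; this is not Perceptus' actual API.
import time

class TrackedObject:
    def __init__(self, label, position):
        self.label = label           # e.g. "rook" or "lego_2x4_red"
        self.position = position     # (x, y, z) in the tabletop's coordinate frame
        self.last_seen = time.time()

class TabletopTwin:
    """Keeps a live inventory of recognized objects on one surface."""
    def __init__(self, cell_size=0.05):   # 5 cm grid cells on the tabletop
        self.cell_size = cell_size
        self.objects = {}                 # keyed by (label, grid cell)

    def _key(self, label, position):
        # Snap each detection to a coarse grid so the same physical piece,
        # seen again in the next camera frame, maps to the same entry.
        cell = tuple(round(coord / self.cell_size) for coord in position)
        return (label, cell)

    def update(self, detections):
        """Merge one frame of (label, position) detections into the twin."""
        for label, position in detections:
            key = self._key(label, position)
            entry = self.objects.get(key)
            if entry is None:
                self.objects[key] = TrackedObject(label, position)
            else:
                entry.position = position
                entry.last_seen = time.time()

twin = TabletopTwin()
twin.update([("rook", (0.10, 0.20, 0.0)), ("knight", (0.15, 0.20, 0.0))])
twin.update([("rook", (0.11, 0.21, 0.0))])   # the rook shifted a little
for (label, cell), obj in twin.objects.items():
    print(label, obj.position)               # still one rook, now at its new position
</code></pre>
<p>Once objects live in a structure like that, the “magic” layer is just code that reacts to it, whether that means swapping the rook’s appearance or suggesting the next Lego piece.</p>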
<p>Don’t focus too much on the two examples Brad demos. The game and Lego organizer aren’t the secret sauce here. They are just examples of things that COULD be built on top of this platform. As I walk around my home I see dozens of things developers could do with Perceptus, from making new musical instruments on top of Coke cans to making all our board games far more interactive and interesting. It could be used for non-entertainment purposes too, like suggesting recipes, or keeping track of things around your home. All my lights have computers in them and they are hard to control. A developer could use this platform to make that much easier. Or they could make our Heinz ketchup bottle play games against the salt shaker next to it. </p>
<p>Some details behind the company?</p>
<p>So far it’s self-funded. He says he isn’t raising funds right now, but I’ve learned many times that entrepreneurs are always raising funds even when they say they aren’t. The valuation just hasn’t gotten interesting enough yet! (My words, not his.) </p>
<p>This isn’t his first company, he cofounded Veridae Systems, Invionics, and Abreezio (acquired by Qualcomm) and has 28 patents to his name. </p>
<p>You can learn more at <a href="https://www.perceptusai.com/" rel="nofollow">https://www.perceptusai.com/</a><br><br><strong>More in depth about my thinking</strong></p>
<p>If you have read this far you are really crazy about augmented reality. My kind of people! So, why am I so strongly excited by this company? </p>
<p>Well, we all know Apple is coming with some sort of head-mounted display (technical term for something like a headphone that has screens for you to look at). There have been tons of rumors, lots of ideas of what is coming. </p>
<p>I’m not playing that game anymore, except to say that whatever comes will be the most expensive product launch of all time in any industry. So expectations are extremely high, and whatever Apple does will change the opportunities for developers.</p>
<p>This is a big example of just what kinds of startups are about to come. I’m expecting that over the next 24 months we’ll see hundreds of startups, like this one, created. That has me excited, but even better, I’ve been doing investor research lately and a huge percentage of them are waiting to see what Apple is doing before even considering whether to invest in virtual or augmented reality. I need more time to finish off that research to present real numbers, but it’s already clear after asking 100 investors. </p>
<p>Major companies are already talking to me about augmenting their products in our homes. Tonight I was talking to an employee at Samsung about its appliances and how they would augment them. Then there are consumer products companies like Procter and Gamble. If someone there wants to augment a Tide bottle it’ll need a computer vision platform. Does this work for every use case? The market will decide but it shows me that magic can be brought by developers to EVERY object in our homes. Forks? Yes. Refrigerators? Yes. Board games? Yes. Spices? Yes. </p>
<p>I’m tracking companies that will have a unicorn potential in this augmented reality world. This certainly is one. </p>
<p>Anyway, the next 16 months will be huge for augmented reality and this is just another example why and it demonstrates that machine learning/computer vision and digital twins are about to become much more useful to consumers.</p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
<item>
<title>Dolby Atmos’ role in better digital experiences</title>
<link>https://scobleizerblog.wordpress.com/2022/02/10/dolby-atmos-role-in-better-digital-experiences/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Fri, 11 Feb 2022 00:27:06 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9405</guid>
<description><![CDATA[First of all, what is my goal? It is simply to find ways to make my systems (which include many headphones, cars, and Sonos system) sound better and to help you enjoy your system. There is a bigger goal I have, which is to get the Spatial Computing industry to care a lot more about … <a href="https://scobleizerblog.wordpress.com/2022/02/10/dolby-atmos-role-in-better-digital-experiences/" class="more-link">Continue reading <span class="screen-reader-text">Dolby Atmos’ role in better digital experiences</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><img width="1024" height="682" data-attachment-id="9419" data-permalink="https://scobleizerblog.wordpress.com/1781851_10152349705079655_1184558509793491090_o-2/" data-orig-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg" data-orig-size="2048,1365" data-comments-opened="0" data-image-meta="{"aperture":"0","credit":"","camera":"","caption":"","created_timestamp":"1574861054","copyright":"","focal_length":"0","iso":"0","shutter_speed":"0","title":"","orientation":"1"}" data-image-title="1781851_10152349705079655_1184558509793491090_o" data-image-description="" data-image-caption="" data-medium-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg?w=300" data-large-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg?w=1024" src="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg?w=1024" alt="" class="wp-image-9419" srcset="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg?w=1024 1024w, https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg?w=2046 2046w, https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg?w=150 150w, https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg?w=300 300w, https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg?w=768 768w, https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg 2048w" sizes="(max-width: 1024px) 100vw, 1024px" /><figcaption>Photo by Robert Scoble. Empire of the Sun performs at Coachella. So far the experience of concerts are out of reach of most people. Dolby Atmos promises to close the gap between “real life” music and recorded music you can enjoy in your home or car.</figcaption></figure>
<p>First of all, what is my goal? It is simply to find ways to make my systems (which include many headphones, cars, and a Sonos system) sound better and to help you enjoy your system.<br><br>There is a bigger goal I have, which is to get the Spatial Computing industry to care a lot more about audio, since music is a foundation for a lot of storytelling, and a huge amount of the difference between, say, watching the Olympics in person and watching them on TV. Even “silent” films from 100+ years ago have music as part of the story. So what is Dolby Atmos and why am I so excited by it (enough to make playlists that have tens of thousands of songs on them, all in Atmos, and to talk with the music industry frequently)?</p>
<p>Neil Young took me into his studio to teach me about this “gap” between a real-life performance, what was captured on the master recordings, and what people hear coming from their headphones or speakers. On most equipment the gap is huge. But now consumers are getting systems that greatly close the gap. Or they could, if they are fed music with higher resolution and with the ability to build surround soundstages that sound closer to real concerts. That’s where Dolby Atmos comes in. </p>
<p>When you go to a real game, the sound is incredible. So far consumers don’t have the equipment, nor the source, to help close that gap. Dolby Atmos closes the gap between a real concert and something you would experience digitally. And closes it in a huge way. It used to be that only wealthy schools could do what my Sonos is now doing.</p>
<p>It gives us several things:<br><br>1. It virtualizes speakers, so that sound can be put all around a listener.<br>2. It turns audio into objects that can be played properly on everything from a $250,000 speaker to the cheap speakers on a modern iPhone.<br>3. It still includes a stereo render so that the music can play on all equipment, even devices that don’t support Atmos (which is the vast majority of devices that people listen to music on).<br>4. It has more bit depth, so sound is better quality.<br><br>Putting audio on things, either real or virtual, will be a big deal. The #1 app on Meta’s (formerly known as Facebook) VR headset, the Quest 2, is Beat Saber, which is built around music but sounds like crap because it’s 2D music, not Spatial Audio like what Dolby Atmos delivers. </p>
<p>I’m not paid/compensated in any way by Sonos, Dolby, Apple, Amazon, or any company I discuss online (if I ever am, I will disclose that).</p>
<p>My qualifications? <a href="https://music.apple.com/profile/scobleizer">I’ve collected tens of thousands of Dolby Atmos songs on Apple Music</a> and moved a lot of those over to other services like Tidal and Amazon so that I can compare music services. If you want help with a specific kind of music, drop me a line, and I’ll send you my playlists on Amazon or Tidal. Apple is the best place because Apple has the biggest catalog of Dolby Atmos that I’ve been able to find. Amazon sounds better, even on Apple headphones, due to using newer Dolby Atmos technology than the others, but it has fewer songs, particularly for those of you who like classical music.</p>
<p>Spatial Audio is something I’ve been studying for decades. I’ve been in Virginia Tech’s building for Augmented Reality research which has 1,600 speakers in one room. When I visited they put me in a recorded football game which blew my mind. <br><br>Neil Young had me in his studio to understand what his analog masters caught of his performance and how much of that is stripped away by technology delivering music at home. I’ve visited with many audio engineers in many studios. Just so you understand that while I’m not an audio engineer, I do have more education on the topic than most people who aren’t audio engineers and I’ve even met audio engineers who are completely working in stereo and who don’t understand Atmos. The music industry is cleaning those people out and building new studios around the world for Atmos.</p>
<p>++++++++++</p>
<p>What is Dolby Atmos?</p>
<p>Instead of talking about bits and bytes and nerdy stuff, let’s ask ourselves “what is the goal for us to recreate music recordings in our home?”</p>
<p>For me, I have seen hundreds of performances from the front row. Buddy Guy played guitar sitting right next to me for 20 minutes. Reggie Watts performed two feet in front of me in Preservation Hall with the band there. I’ve been to Austin City Limits, Coachella, and many festivals. </p>
<p>My goal has been to recreate this concert experience at home. Most people can’t afford to go to, say, Coachella to listen to live music. Marshmello played there to about 30,000 people. A few years later he was on Fortnite performing to 11 million. And it sounds like shit compared to Atmos on my Sonos system. The gap between the Fortnite performance and what his concert sounded like in real life is huge. <br><br>We can do better.</p>
<p>So, let’s start out with something simple.</p>
<p>A three-piece band. A singer. A drummer. A guitar.</p>
<p>The old way was to record in “stereo.” Two audio channels. And distribute that at 44.1 kHz, or worse (Spotify is usually compressed on top of that). That is what is on CDs (which I first started selling in the early 1980s in the consumer electronics store I worked at).</p>
<p>In stereo the band is “stuck” mostly between your two speakers, with only a little that goes outside of that. When we sold audio gear in the 1980s we’d say “this speaker has a wider soundstage,” because you could more clearly hear where a flute or clarinet sat in, say, a symphony. <br><br>Today, with Atmos, that soundstage can be all around you. The guitar can sit in a specific place in 3D now, which wasn’t the case with stereo. In theory an Atmos mix can even be rendered differently in the future, because the technology stores where each sound should come from separately from the sound itself. </p>
<p>The process of making Dolby Atmos is different than it used to be. Now a technology team needs to “program” where sound should be around you.</p>
<p>That is impossible to do in just stereo. Audio engineers sometimes even put each performer on a different speaker in their studios and move those physical speakers around to decide where to put each on the computer, which translates to your Sonos as “drums in back, guitar on right, singer in front.”</p>
<p>In my home, with my system, this sounds like my whole room comes alive, with audio that extends behind my speakers, over my head, and even behind me if I have the speakers directly to my sides or a little behind me. Sometimes it even “fakes” sounds behind you where there aren’t any speakers. You can actually hear this on an iPhone 13 Pro or Pro Max. Play Dolby Atmos music on one and make sure you see the Dolby Atmos logo (if you don’t, there are a couple of places in settings you need to change, and you need to pay for the top tier of your music service, which is particularly important on Amazon Music, to get it). Turn your phone and you’ll notice that sound often appears to come from next to you or even behind you, where there are no speakers!</p>
<p>When Apple turned on Atmos last year, I also noticed in my headphones and in my car that Atmos music has better bass and clarity. That’s what got me interested in Atmos. Soon after, I purchased my Sonos Arc soundbar and started talking with musician friends and audio engineers about this. </p>
<p>The reason Atmos has better bass and audio, the engineers tell me, is that when the music is sampled it not only has more samples per second (48 kHz vs. 44.1) but also more bit depth (the numbers are longer, which makes for better sound thanks to more information). <br><br>Some of those benefits come from “high resolution” music (music recorded at higher sampling rates than what was used for CDs), but if you have a full surround sound system you hear it isn’t just that: the music is fundamentally different from what it used to be.<br><br>Some purists argue that this isn’t what we should do. That stereo is how “God” made recorded music and we shouldn’t mess with that. <br><br>I come from a different place: if we really are going to take audio to the next level we must go way beyond stereo.<br><br>I hear another objection: that there is a better way to do Spatial Audio, particularly in VR, since the platforms that make VR can put sound on any of the virtual polygons that make up what you are seeing in a VR headset. The problem is that the music industry has decided to go with Dolby Atmos, or a similar technology, Sony’s 360 Reality Audio. </p>
<p>So, we need to see VR headset manufacturers support Dolby Atmos at a deep level and make it so the virtual box around the listener can be “attached” to the real world, so we can really recreate a concert experience (I’ve been on stage at concerts while musicians are playing, and in real life you can hear what it sounds like to stand between, say, the guitar and the drums). You can’t do that at home yet.</p>
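<p>Before looking at the future, a quick back-of-the-envelope calculation of the sampling-rate and bit-depth point above. This is just my own arithmetic on raw, uncompressed numbers; streaming services compress these streams in different ways.</p>
<pre class="wp-block-code"><code>def pcm_kbps(sample_rate_hz: int, bit_depth: int, channels: int) -> float:
    """Kilobits per second of uncompressed PCM audio."""
    return sample_rate_hz * bit_depth * channels / 1000.0

cd_stereo   = pcm_kbps(44_100, 16, 2)  # classic CD-quality stereo
hi_res_pair = pcm_kbps(48_000, 24, 2)  # 48 kHz / 24-bit, before any extra object channels

print(f"CD-quality stereo: {cd_stereo:,.0f} kbps")    # about 1,411 kbps
print(f"48 kHz / 24-bit:   {hi_res_pair:,.0f} kbps")  # about 2,304 kbps
print(f"Roughly {hi_res_pair / cd_stereo:.2f}x the raw data for the same two channels.")</code></pre>
<p>Each extra bit of depth roughly doubles the number of loudness steps a sample can describe, which is where the “longer numbers, more information” point comes from.</p>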
<p>+++++++++++++++++++++<br><br>What about the future? <br><br>The holy grail is to “fool” the listener into thinking she/he is at a concert, where sound is coming from all around you, particularly at something like Coachella. There they have dozens of speakers in front of, above, and behind you, and that sound is bouncing off of everything else. <br><br>We aren’t there yet. <br><br>Now add on augmented reality or virtual reality glasses. They could “lock” the Atmos virtual box to the real world, letting you “walk around” a band, like you can at a real concert. <br><br>It is this goal that has me most excited. If we can put a 3D sensor on your face, along with screens that cover your eyes and headphones that cover your ears and deliver real surround sound, we can deliver much better audio than headphones do today.<br><br>So far we haven’t seen this “holy grail” ship to consumers. I expect that to change in the next year.<br><br><a href="https://music.apple.com/profile/scobleizer">Which is why I’m collecting all the Atmos music I can</a>. When you get one of the devices that I know are being built by Apple, Meta, Tesla, and others, you’ll be able to hear the magic of Atmos. <br><br>Until then, start “future proofing” your playlists by choosing to collect Dolby Atmos. By the end of next year you’ll see it will be much more important than it is today. <br><br>Music never sounded so good!</p>
<p>Here are the largest collections of Dolby Atmos music anywhere:</p>
<p>Apple: <a href="https://music.apple.com/profile/scobleizer">https://music.apple.com/profile/scobleizer</a></p>
<p>Amazon: <a href="https://music.amazon.com/profiles/s5yxpchb5am5nbpi5ylmmrqrry?ref=dm_sh_gEk0YWvS1hxV9Tgn0JGEfkFuQ">https://music.amazon.com/profiles/s5yxpchb5am5nbpi5ylmmrqrry?ref=dm_sh_gEk0YWvS1hxV9Tgn0JGEfkFuQ</a></p>
<p>Tidal: <a href="https://listen.tidal.com/playlist/6603219c-65cf-4299-87e9-54cad6e729ab">https://listen.tidal.com/playlist/6603219c-65cf-4299-87e9-54cad6e729ab</a> (search for “100% Dolby Atmos” to find 50+ other lists).</p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
<media:content url="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/02/1781851_10152349705079655_1184558509793491090_o.jpeg?w=1024" medium="image" />
</item>
<item>
<title>The Augmented Home</title>
<link>https://scobleizerblog.wordpress.com/2022/01/11/the-augmented-home/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Tue, 11 Jan 2022 23:44:01 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9353</guid>
<description><![CDATA[In Silicon Valley we like to think about “first principles.” What does this really mean? Go back and really study how things work without bringing your biases to the table and then come up with new products and services that help you do that task better. Steve Jobs went further. When he started the iPhone … <a href="https://scobleizerblog.wordpress.com/2022/01/11/the-augmented-home/" class="more-link">Continue reading <span class="screen-reader-text">The Augmented Home</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><img width="964" height="1024" data-attachment-id="9354" data-permalink="https://scobleizerblog.wordpress.com/screen-shot-2022-01-11-at-12-29-46-pm/" data-orig-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png" data-orig-size="2212,2350" data-comments-opened="0" data-image-meta="{"aperture":"0","credit":"","camera":"","caption":"","created_timestamp":"0","copyright":"","focal_length":"0","iso":"0","shutter_speed":"0","title":"","orientation":"0"}" data-image-title="screen-shot-2022-01-11-at-12.29.46-pm" data-image-description="" data-image-caption="" data-medium-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png?w=282" data-large-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png?w=964" src="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png?w=964" alt="" class="wp-image-9354" srcset="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png?w=964 964w, https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png?w=1928 1928w, https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png?w=141 141w, https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png?w=282 282w, https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png?w=768 768w" sizes="(max-width: 964px) 100vw, 964px" /></figure>
<p>In Silicon Valley we like to think about “first principles.” </p>
<p>What does this really mean? Go back and really study how things work without bringing your biases to the table and then come up with new products and services that help you do that task better. </p>
<p>Steve Jobs went further. </p>
<p>When he started the iPhone team he told the first 12 people “do not hire anyone who has ever worked on a phone before.” </p>
<p>Why? </p>
<p>He wanted a fresh look at the industry, and he wanted to see how new technology and a new approach could change the game. Elon Musk recently did the same when he and his team designed the Cybertruck. They threw out all our beliefs about what a truck should even look like. So many people have been polarized by that design, which shows both the power and the danger of taking such an approach.</p>
<p>As companies like Apple and Meta, er, the company formerly known as Facebook, try to introduce new devices that will “augment” the home in various ways, it’s useful to study what people actually do in their homes. That way you can judge how much usage these devices will get and how accepted they might be, or even do competitive analysis to see who is strongest at a specific use case in the home.<br><br>I’ve been fortunate to have been invited into people’s homes all over the world. Rich, poor, and everything in between. That let me observe what people actually do, vs. what I think they do. And what they have and how they spend their time.</p>
<p>That turned into a framework, a simplified version of which is shared here, to help everyone think through new companies in a new way. If only to help build competitive analysis in the future.</p>
<p>This led me to notice that very few people do “high-engagement computing” while walking around. <strong><em>They almost always are sitting down when doing high-engagement computing </em></strong>(a standing desk messes things up a bit, but very, very few people have one of those in their family rooms, so we’ll leave that for a future discussion). <strong><em>This tells me the augmented reality industry has focused on the wrong type of experience. </em></strong></p>
<p>Apple and Unity aren’t going to continue making this mistake, because they have done the kinds of human-factors research I have. Apple even built a fake home to study these different contexts. Apple’s strategists are focusing on this high-engagement computing, I hear from people working on its new products.<br><br>How do I define engagement? Are you touching or interacting with the computer at a high rate?<br><br>“Low engagement?” Things like listening to music (you do realize that is computing, right? Try playing music in your headphones, car, or home without a computer). You rarely touch a screen or interact with the music, other than to start up a playlist or play a song. Yes, we could say that if you are doing a ton of searching, liking, and curating you are moving toward high engagement, but I still count that as low-engagement computing.</p>
<p>High engagement? If you are designing something in a 3D app, working on a spreadsheet, building a slide deck, or even answering email, I put that in the high-engagement camp. </p>
<p>When you follow people around and watch what they actually do, you see that both matter. An architect can design a new building while also listening to music. One is a high-engagement task, the other low. </p>
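<p>If it helps, here is my framework reduced to a toy rule of thumb in code. The threshold is a made-up number purely for illustration, not something from a study.</p>
<pre class="wp-block-code"><code>def engagement_level(interactions_per_minute: float) -> str:
    """Toy classifier for the high/low-engagement framework described above.

    An "interaction" is a deliberate touch of the computer: a tap, a keystroke,
    a controller move. Starting a playlist is one interaction; building a
    spreadsheet is hundreds.
    """
    threshold = 5.0  # made-up cutoff, purely for illustration
    return "high engagement" if interactions_per_minute >= threshold else "low engagement"

print(engagement_level(0.2))   # listening to an album: low engagement
print(engagement_level(40.0))  # working on a slide deck: high engagement</code></pre>
<p>The exact cutoff doesn’t matter; what matters is that sitting-down, hands-busy tasks cluster at one end and lean-back media at the other.</p>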
<p><strong>Generational shifts</strong></p>
<p>Visit a home with two teenagers and you’ll see that while the parents do most of their entertainment-focused computing in the family room, that isn’t true of the kids. Some are very focused on being in their rooms, watching videos, or, more importantly, gaming. Some families even banish the gaming system to the garage, so that when adults are having a party the kids are out there playing to their hearts’ content.</p>
<p>Apple has new eyes on these uses of technology in the home. Can Apple win back the kid who has a gaming console or a PC in their bedroom? My research shows yes, but that will be very tough short term, particularly if Apple is planning on charging $3,000 for its visual headphone. </p>
<p>Kids also have their own media and music preferences that often clash with those of adults, which is another reason they are relegated to the hinterlands of bedrooms or garages. Can a new kind of headphone change that mix? Yes. But we need to see what actually happens vs. what our theories say will happen. Lots of unknowns here, like the role of AAA games and what the music and metaverse industries push to the kids.</p>
<p><strong>What is the conclusion?</strong></p>
<p>That there is a new, unserved market for VR/AR companies: people who are sitting down to do a variety of high-engagement tasks. A market that is now working at home, along with doing other high-engagement tasks. How many couples are now both working at home, struggling to deal with Zoom calls and learning new collaborative tools? </p>
<p>Even when the kids come out to watch a movie with their parents, you see another thing.</p>
<p>Everyone is still on their personal devices, like phones or tablets — which are completely crappy screens compared to an 83-inch TV for watching video content. Still the mobile phone persists, even in such a home. TikTok is the biggest beneficiary of this new behavior, even while people watch TV or a movie, something that 20 years ago would have gotten everyone angry at you for being distracting. Now everyone is in this state of continuous partial attention. Former Apple/Microsoft exec Linda Stone told me that was coming, back when I worked at Microsoft 17 years ago. Here we are: watching a Netflix show on the big screen while watching TikTok videos on the small one. </p>
<p>This tells me that if a new device gets launched that lets you “soak” in both worlds, it will be very popular. Of course I’m thinking of Apple, but others will try to compete for the home. I don’t think they will be very successful short term, but we’ll see why when Apple shows its hand in June.</p>
<p>Uber was invented after the iPhone shipped. I was there in the Paris snowstorm when Garrett Camp and Travis Kalanick were complaining, saying “why can’t we see where the cars are on our phones?”</p>
<p>The introduction of a new computing device let us think differently about a task we had done thousands of times before (getting a ride in a taxi).</p>
<p>Same will happen in the home. </p>
<p>After we get augmented reality devices we will change what we want in the home. </p>
<p>But, as with Uber, the businesses that take on things we already do will have a major leg up on companies that try to introduce a completely new behavior.</p>
<p><strong>The big money spot</strong></p>
<p>So, where is the sweet spot? The one that will end up with most of the competition and money.</p>
<p>Let’s focus on just the family/living room. You know, the one room that has the best TV in the house. When I visit homes I look for commonalities. Literally everyone has a TV in their family room. Even very poor homes with dirt floors almost always had a TV, and if they didn’t, a neighbor had one and invited everyone over to watch the local football team (soccer in most of these places).</p>
<p>While in the family room, I keep track of who has a state-of-the-art 2020-or-newer TV or audio system. </p>
<p>Even in “rich” Silicon Valley almost no one does. This is key for Apple, because only a tiny percentage of people have a surround sound system or a big TV capable of 4K or better resolution. </p>
<p>Our family invested in a $20,000 media room, complete with everything state of the art, or at least state of the art as of last June (and state of the art for that price level — the billionaires sometimes have $250,000 rooms but I study mass market consumer behavior, not that of billionaires).</p>
<p>Doing this research also led me to study people’s reactions as they visit our home (we had a family here from Texas for 10 days at Christmastime, too, so I got a ton of insights from that).</p>
<p>Even with such a mind-blowing system people still are poking at their phones, looking at TikTok or playing a variety of games or watching YouTube. I started keying in on this. <strong>Conclusion: custom media beats high quality media. </strong>People regularly tell me they won’t switch from Spotify, even though it has easy-to-demonstrate far shittier music quality than, say, Amazon or Apple Music on my new Sonos system. Why? Personalization beats quality. Over and over.</p>
<p>What if someone gives consumers both? </p>
<p>My research shows that such a device will sell very very well.</p>
<p>Other things to pay attention to if you are starting a company that is going after home users?</p>
<p>Home users multitask. How many of you watch a TV show while dinner is cooking, or while you are actually working on a different screen? Many of you keep the TV on in the background all day long. Others, music. While I’m typing this on my iMac, which is in our kitchen (Maryam is working in the office down the hall), I’m watching what’s going on over on Twitter (TweetDeck is on our big screen) and what is going on around me (Maryam emerged from her office to get some lunch). A bunch of high- and low-engagement behaviors all mixed together. Some of them will get us to put on headphones. Some will get us to take them off.</p>
<p>A funny thing I discovered in homes that have a modern Sonos, by the way, is that only a small percentage of people who have purchased a modern AV system have it properly tuned and set up (Sonos has a feature called “Trueplay” that makes sound dramatically better if you use it right; many people, even hyper-rich ones, haven’t run it at all. Plus, it took me months to figure out that I wasn’t getting the best picture quality on my TV because of one setting I missed). Most people also have small TVs. Only a tiny percentage of homes, even in “rich” Silicon Valley, have TVs bigger than 65 inches.</p>
<p><strong>The resistance</strong></p>
<p>This means there’s a lot of resistance to devices that cause task switching. You hear a ton of resistance when doing market research. Many, many people tell me “I will never wear anything on my eyes.” Many tell me that and then pull headphones out of their pockets when I ask. Never believe what people tell you; always watch. I have thousands of examples where people told me “I’ll never do that” and then three years later they are doing that same thing all the time. Same here. It’s hard to see how a VR device would gain so much mainstream acceptance that everyone will wear it for hours every day. But I’ve studied several paradigm shifts close up. Right before each hit, and even as each was hitting, there was extraordinary resistance. For instance, I got yelled at in a coffee shop when I was trying to complete the first credit card transaction in that store, and when I introduced email to the office I was working in, people complained, saying they liked paper memos better.</p>
<p>Even people with VR headsets rarely brought them into the family room. Already they have been relegated to other spaces for a variety of reasons. Why? The task switch from being immersed in, say, Beat Saber, to making sure the kids are doing their homework, or making sure your food isn’t burning, is extremely high and keeps VR from being used nearly as often as it should be. </p>
<p>Such headsets also bring new problems into family rooms (my son hurt someone who walked into his play space while he was playing a high-engagement game and swinging around wildly). I hear Apple has focused a lot on this task switching, along with the safety issues in the home (we’ll see how it solved those problems in June at WWDC when it introduces its new visual headphone).</p>
<p>Also, the person who is watching a TV show while playing Words with Friends on her iPhone still wants to be social with others in the room. While you COULD use, say, Apple headphones while others are in the room (I tested this at Christmas dinner; everyone got used to me being a dork, and I could listen to music while still talking to them with transparency mode), very few feel comfortable wearing a device while trying to be social. Will that change? I predict it will, based on my research, but Apple has to solve the task-switching problem. It’s possible for such a device to watch and listen to everything for you. What if you were playing Beat Saber and it told you “the kids are now playing on their computers.” Or, “your oven timer is going off.” Or, “a smoke alarm is going off, so we are pausing all high-engagement computing until you deal with it.”</p>
<p>I asked lots of questions about what would happen in the home if everyone had a device on, and new kinds of experiences were introduced (like new kinds of board games you could play with your kids). Here people are intrigued, but say it depends a lot on the execution. So far only weirdos like me try wearing devices. My research shows that could change in a huge way.</p>
<p>The catalyst? Football. What gets people to come over more than any other event? Football. Since I have the biggest, sharpest TV in our family/friend group, I find that every Sunday I have people over to watch it. That didn’t happen before.</p>
<p><strong>The context is key</strong></p>
<p>If you are building a game that is really about walking around the world, er, your neighborhood outside, it will have a tough time serving the “sitting on your couch” context.</p>
<p>Let’s back up. I wrote a book about contextual computing. Back when Shel Israel and I wrote it, we didn’t have the kinds of AI we do today. Now we have new kinds of AI that haven’t yet been introduced into the home.</p>
<p>If you are dancing, the potential things you will do are a lot different than if you are lying down watching TV. </p>
<p>If you are building augmented reality services for the home thinking through the context is key.</p>
<p>There will be different distribution channels to get to those contexts, too. Would you launch a sports app in the same places you launch a math app? No! So people who help entrepreneurs, like me, have to keep track of each channel and build relationships with those who run them. I already have a list of 2,200 companies building the 3D Internet (AKA the metaverse), and I am rethinking them thanks to the context map I posted here.</p>
<p>Already well-funded companies are going after each of these contexts. So I’m tracking the money spent, and also what new startups are up against as they try to get consumers to take time they used to spend watching TV shows or movies and do something new, like playing multi-player augmented reality games in our homes. That competition will lead to many millions of words being written, and to lots of opportunities for partnerships and mergers. </p>
<p><strong>What does this mean for entrepreneurs?</strong></p>
<p>Each will need to know what they are up against. If you are competing for sitting-on-the-couch time you are up against Ted Lasso, football, and everything else that comes along. Unity alone is going to make a HUGE play for this space. Becoming “the virtual refrigerator company” will be a lot easier, based on my experience, for a low-funded startup than trying to be “the virtual coffee table company.” That said, if you have a real shot at the coffee table, the minutes spent there in each home are much higher than the time spent at the refrigerator. Each choice you make will have consequences down the road (and different financing and staffing needs).</p>
<p>Look for what the big companies are not yet doing. Lots of analysts will track that, but I don’t know anyone who has approached each context in the home yet.</p>
<p>This is why I am talking with so many startups lately, all for free with no strings attached. This lets me figure out how the market will break down, and where to look for newer approaches for each. My phone is open at +1-425-205-1921, please text first. </p>
<p>I’ll keep this all up to date and keep adding more details based on what else I learn between now and when Apple comes later this year. </p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
<media:content url="https://scobleizerblog.wordpress.com/wp-content/uploads/2022/01/screen-shot-2022-01-11-at-12.29.46-pm.png?w=964" medium="image" />
</item>
<item>
<title>Dolby Atmos and Apple’s rewritten audio stack</title>
<link>https://scobleizerblog.wordpress.com/2021/09/30/dolby-atmos-and-apples-rewritten-audio-stack/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Thu, 30 Sep 2021 21:30:49 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9345</guid>
<description><![CDATA[This is a reprint of my email newsletter. Subscribe here. Human storytelling is about to become MUCH richer. I’ve been spending the summer understanding the technology changes coming to consumers soon. Lots of people at lots of companies, from car companies, to audio companies, have told me what is coming. Which made me realize that … <a href="https://scobleizerblog.wordpress.com/2021/09/30/dolby-atmos-and-apples-rewritten-audio-stack/" class="more-link">Continue reading <span class="screen-reader-text">Dolby Atmos and Apple’s rewritten audio stack</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<p><em>This is a reprint of my email newsletter. <a href="https://www.getrevue.co/profile/scobleizer">Subscribe here</a>.</em></p>
<p>Human storytelling is about to become MUCH richer.</p>
<p>I’ve been spending the summer understanding the technology changes coming to consumers soon. Lots of people at lots of companies, from car companies, to audio companies, have told me what is coming. Which made me realize that by Christmas of 2022 we will be seeing a HUGE upgrade in all media quality.</p>
<p>What am I expecting to upgrade? Well, audio is being upgraded right now, so that’s one, and on a high-end audio system you already can experience better music than I did by standing in the front row of more than 150 concerts. When I was last at Preservation Hall in New Orleans Reggie Watts gave a performance three feet from me. Yes, that’s amazing. But what is more amazing is that while only a few people can fit into Preservation Hall, now we have the same audio quality in our homes. Actually better, most of the time, truth be told. </p>
<p>Soon photos will upgrade. So will video, which is already seeing dramatic changes on the high end. So will VR and AR, which will provide radical new experiences in your homes from multi-party video games to new kinds of virtualized TV screens.</p>
<p>Lately I’ve been telling everyone to pay attention to Dolby. Why? Dolby has been working for years on what Apple is about to bring to market.</p>
<p>In this newsletter I’ll focus on Dolby Atmos and the changes that happened the past few months on Apple Music. </p>
<p>What impressed me is that Apple totally rewrote its audio stack on all of its devices and put two AI chips in its headphones to enable Atmos there. Apple’s audio renderer is the best in the business, Sean Olive, head of R&D at Harman, which makes JBL, told me. He should know; he’s one of the few humans who has built a double-blind audio testing lab, among other feats.</p>
<p>I start with music because music has ALWAYS been an important part of storytelling. My son just forced me to watch a “silent” film from more than 100 years ago. It had music! On a “silent” film.</p>
<p>This is why I am trying to get the VR/AR industry to care about the changes going on with Apple Music and its investments in 3D music. With Dolby Atmos you often feel like you are in the front row listening to your favorite music. It surrounds you, just like if you were very close. </p>
<p>When I grew up I listened to tons of music coming over AM radio (KFRC in San Francisco, amongst others) on a small transistor radio. The quality was worse than today’s iPhone speakers (which also just got improved; my iPhone 13 with iOS 15.1 even has a pretty interesting version of Dolby Atmos that would have blown away my 13-year-old self). But on Apple’s AirPods Pro or AirPods Max headphones you can hear the difference already. Dolby Atmos content is richer, nicer to listen to for long periods of time, and is in 3D around you (yes, we can argue that it’s not the same as on, say, my Sonos gear, but that misses the point. Dolby Atmos is 3D music AND has way better fidelity to boot. It’s better even on cheap speakers, and when you put it on a $4,000 system it’s explosive and beats most Broadway plays and concerts for quality).</p>
<p>So, to get you started, this summer I went through tens of thousands of songs on Apple Music, Tidal, and other places to try to find everything available in Dolby Atmos. </p>
<p>I still remember walking into Tower Records in the early 1980s, when CDs were just coming out and they only had a few dozen to choose from. Now we have 6,000 Dolby Atmos songs. Yes, that is still only a fraction of the millions and millions of songs available, but anyone who doesn’t use 3D music in the future will sound lame. Everything from movies to TV shows to new VR experiences will bet heavily on 3D audio, er, Dolby Atmos. Every week we are seeing major new releases in Atmos, and the number is going up exponentially. </p>
<p>So, here’s a playlist so you can get up to date. Apple has a new device coming by Christmas to play all of these on, so you’ll definitely want to come back and listen to them again on that.</p>
<p><a rel="noreferrer noopener" href="https://music.apple.com/us/playlist/complete-only-dolby-atmos/pl.u-zJYsZ6806L?utm_campaign=The%20Experiential%20World%3A%20Metaverse%20to%20Ted%20Lasso&utm_medium=email&utm_source=Revue%20newsletter" target="_blank">The complete list (way bigger than Apple’s own list)</a>. I then went through all 6,000 songs, looking for the best examples of Dolby Atmos usage. </p>
<p><a rel="noreferrer noopener" href="https://music.apple.com/us/playlist/dolby-atmos-innovators/pl.u-yZ7mFY0ZK0o?utm_campaign=The%20Experiential%20World%3A%20Metaverse%20to%20Ted%20Lasso&utm_medium=email&utm_source=Revue%20newsletter" target="_blank">That led to this “innovators” playlist</a>.</p>
<p>Yes, if you aren’t on Apple Music you are an idiot. </p>
<p>It’s way better sounding than Spotify. Better in Apple headphones than Tidal. So, I’m starting this newsletter to cover the changes for people on the Apple ecosystem who can afford $250 headphones and up. If you have an Android phone, good for you. It doesn’t sound as good as Apple’s ecosystem. I don’t write these to cover why to move ecosystems, but from now on this newsletter will focus on content for Apple users. If you have another device, I really don’t care. You aren’t experiencing the best media out there if you do. It might be harsh to state that, but it’s true, and the gap will grow greatly next year.</p>
<p>Over on Apple Music you should follow me, so you can see what I’m listening to in real time. You also can see dozens of other Dolby Atmos playlists. <a rel="noreferrer noopener" href="https://music.apple.com/profile/scobleizer?utm_campaign=The%20Experiential%20World%3A%20Metaverse%20to%20Ted%20Lasso&utm_medium=email&utm_source=Revue%20newsletter" target="_blank">Over on my Apple Music Profile</a> you will see playlists of Classical, Jazz, Rock, Country, and many more. All Dolby Atmos. Soon everyone will understand why this is important as we move to a 3D metaverse. </p>
<p>Either way, I don’t know how often I’ll write these newsletters, but will keep them coming when something big releases. For instance, <a rel="noreferrer noopener" href="https://music.apple.com/us/album/hamilton-an-american-musical-original-broadway-cast/1025210938?utm_campaign=The%20Experiential%20World%3A%20Metaverse%20to%20Ted%20Lasso&utm_medium=email&utm_source=Revue%20newsletter" target="_blank">last week the music from “Hamilton” was released in Dolby Atmos</a>. This recording made me cry as I realized the music on my Sonos system was better than what I experienced at the real play when it visited San Francisco. When you get a chance to hear this on an amazing Atmos system, it is revelatory. A friend who was in the studio told me that they arranged each track on speakers in a room so they could argue about where in 3D each performer should be placed in your home. It is wonderful to listen to, and much better than the same music released a few years ago in two-channel sound.</p>
<p>Another one that just got released that’s notable <a rel="noreferrer noopener" href="https://music.apple.com/us/album/sketches-of-spain/832060615?utm_campaign=The%20Experiential%20World%3A%20Metaverse%20to%20Ted%20Lasso&utm_medium=email&utm_source=Revue%20newsletter" target="_blank">is Miles Davis “Sketches of Spain.”</a> On a high-end system you can hear the musicians at the right height and it just is exquisite. My wife and I have listened many times already.</p>
<p>Finally, I will also point to other people who review music and experiences and will bring you the best that I find, <a rel="noreferrer noopener" href="https://www.soundandvision.com/content/miles-davis-gets-re-mixed-dolby-atmos?utm_campaign=The%20Experiential%20World%3A%20Metaverse%20to%20Ted%20Lasso&utm_medium=email&utm_source=Revue%20newsletter" target="_blank">like this one in Sound and Vision about another excellent Miles Davis album</a> in Dolby Atmos.</p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
<item>
<title>The Apple Table is Set</title>
<link>https://scobleizerblog.wordpress.com/2021/06/08/the-apple-table-is-set/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Tue, 08 Jun 2021 16:24:37 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9341</guid>
<description><![CDATA[The big meal is about to come. We’ve been waiting for years for Apple to reveal its mixed reality products, including visors and glasses. We’ve been seeing the potential coming in other products like the Oculus Quest, the Magic Leap 1, or Microsoft’s Hololens. For years we’ve dreamed about an augmented world. Steve Jobs called … <a href="https://scobleizerblog.wordpress.com/2021/06/08/the-apple-table-is-set/" class="more-link">Continue reading <span class="screen-reader-text">The Apple Table is Set</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<p>The big meal is about to come. We’ve been waiting for years for Apple to reveal its mixed reality products, including visors and glasses. We’ve been seeing the potential coming in other products like the Oculus Quest, the Magic Leap 1, or Microsoft’s Hololens. For years we’ve dreamed about an augmented world. Steve Jobs called them “bicycles for the mind.”</p>
<p>Yesterday Apple announced a bunch of things to developers and a few of these “human helpers.” Techmeme has all the reports here: <a href="https://www.techmeme.com/210607/p22#a210607p22">https://www.techmeme.com/210607/p22#a210607p22</a><br><br>The most interesting was that, as of yesterday, Apple Music is now available in Spatial Audio. In a way, the headphones that enable that feature are the first to use Apple’s new philosophy: “how many ways can we improve lives by including more AI and 3D visualization in products?”</p>
<p>Apple knows that great audio is the foundation of great experiences in visual apps, like video games, entertainment, virtual shopping, education, and concerts. So it makes sense to upgrade all audio, which Apple is in the middle of doing. Another major upgrade comes next year when audio gets locked to the real world.<br><br>Apple is confusing people with all the new audio terms. Lossless and Spatial Audio are two I hear a lot, but Apple hasn’t been clear about why we want either, or both. By this time next year it’ll be clear, or, rather, it won’t matter. Most new media will be available in the better of the two formats, Spatial Audio. That format gives you infinite surround sound and, coming next year, it will be locked to the real world. Really these announcements are about moving us toward the experiential world.</p>
<p>When I say “experiential world” what do I mean? Well, going to a concert or a sporting event is experiential. You experience these things by being immersed in the event you are attending. What Apple is heading toward is shipping a “Holodeck” that will let you attend virtual concerts. Or go virtual shopping. Or attend a virtual school. Among other things. </p>
<p>Apple will, before the end of the year, I hear, announce two products: a brand new iPod that we haven’t yet seen and a new “Holodeck” which is basically a high-end headphone and visor for experiencing mixed reality.</p>
<p>That said, I was expecting a little more about the 3D map. The fact that Apple showed us a new 3D map but didn’t give developers a bunch of new capabilities is interesting to note. That tells me Apple will be more muted than I was expecting. It signals that Apple will take a “Viewmaster” product approach, where the announcement of the product might not need, or have an affordance for, a lot of third-party apps. </p>
<p>Look at Apple Music. To me that’s where Apple is setting the strongest tone about what is to come. It just upgraded a good chunk of its catalog to support Spatial Audio. No external developers needed. In fact, most people who work at music companies have no idea what Apple just did to the music industry. Let’s put it this way: I just cancelled my Tidal and Spotify accounts. </p>
<p>Because of this trend toward experiential services and away from tons of new developer efforts, Apple has continued the course set by Steve Jobs. Apple knows the consumer electronics industry better than anyone. The iPhone was launched directly against the Consumer Electronics Show, held every January in Las Vegas. That wasn’t an accident.</p>
<p>Anyway, I’ll now turn my efforts to the business impact of these moves as Apple comes after owning your home. Both on its competitors and on developers that rely on Apple’s ecosystems for their business.</p>
<p>The “Holodeck” (there’s a rumor that Apple calls it “Apple View,” but I’ll call it the Holodeck until Apple officially announces it) will be aimed at our family/living rooms and the entertainment context. I’m not expecting people to wear it much outside. I’m counseling entrepreneurs to focus on family time: playing games, watching TV and movies, reading books, listening to music, etc. Entrepreneurs who have a strong story for what people will do in their family rooms have a shot at building significant businesses.</p>
<p>Entrepreneurs I’m working with are finding it challenging to raise money at the moment. That will change over the next few months, particularly after Apple announces. This won’t be the first time investors have changed their attitude. Back when AltspaceVR started up, the founders told me that no one would invest in it. Then Mark Zuckerberg bought Oculus, and several of the investors they had visited called back.</p>
<p>Same will happen here. So, have an investment plan for before the Holodeck gets announced, and a different one for after. Same for your PR plan and your go-to-market plan. </p>
<p>I’ve been talking to a bunch of companies who are skating to where the puck will be, to use a hockey metaphor. One entrepreneur, Robert Adams, stands out as a good example of someone building technology that Apple will need in the future. His business, <a href="https://www.globaledentity.com">Global edentity</a>, builds a variety of sensors and AI that see biomedical identity, among other things.<br><br>Now, that usually sounds dystopian, but Apple is already showing us how to thread that needle: stay focused on delivering products that help humans and minimize the consequences (and with this new genre of technology there are many). The things it showed us yesterday will improve our lives. Spatial Audio, for instance, makes music sound way better. That doesn’t sound dystopian now, does it? Nope.</p>
<p>Adams has shown me many new technologies that companies like Apple could add to future products and services. </p>
<p>For instance, one of his inventions looks at the vascular and/or skeletal system of the user. This is something a camera can do that your human eye can’t. Can you see the blood flowing through people’s faces? Nope, but his system can. What does that lead to? All sorts of things, from earlier sensing of disease to much better identity systems. If Apple wanted to make users much more secure and healthier it could, but it would need to invest in companies like the one Adams is building before a big company such as Google buys them first. This is what Apple has been doing for the past decade: buying a bunch of little companies that you don’t know much about that are now becoming important parts of Apple’s tech stack (like the 3D sensor in the latest iPhones, which came from a small company in Israel, PrimeSense).</p>
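<p>To show that the blood-flow point isn’t science fiction, here is a toy sketch of the classic remote-photoplethysmography trick that researchers have published for years. To be clear, this is not Global edentity’s technology, just my own illustration of the general idea: blood flow causes tiny periodic color changes in skin, and the dominant frequency of that wiggle is the heart rate.</p>
<pre class="wp-block-code"><code>import numpy as np

def estimate_pulse_bpm(green_means: np.ndarray, fps: float) -> float:
    """green_means: the average green-channel value of the face region, one number
    per video frame. Blood flow makes this wiggle slightly at the heart rate."""
    signal = green_means - green_means.mean()          # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # frequencies in Hz
    # Only consider plausible heart rates, roughly 40 to 240 beats per minute.
    lo = int(np.ceil(0.7 * len(signal) / fps))
    hi = int(np.floor(4.0 * len(signal) / fps))
    peak = lo + int(np.argmax(spectrum[lo:hi + 1]))
    return 60.0 * freqs[peak]

# Fake a 10-second clip at 30 frames per second with a 72-beats-per-minute pulse:
fps, true_bpm = 30.0, 72.0
t = np.arange(int(fps * 10)) / fps
fake_green = 120.0 + 0.5 * np.sin(2.0 * np.pi * (true_bpm / 60.0) * t) + np.random.normal(0.0, 0.2, t.size)
print(round(estimate_pulse_bpm(fake_green, fps)))  # prints roughly 72</code></pre>
<p>Real systems add face tracking, motion compensation, and far better signal processing, but even this crude version hints at what a camera can see that your eye can’t.</p>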
<p>Another of his inventions could make our health better, and add further identity capabilities, by putting a smell sensor in a future phone or glasses. Such a sensor and its associated AI could even smell that you are experiencing troubling emotions like anxiety, or even a health problem. I’ve seen dogs that can smell those, along with other things, on humans, and sensors that can “smell” are coming along. They say even a blind dog knows its owner! So why not your phone? Imagine the possibilities.</p>
<p>Anyway, I’m seeing an explosion in entrepreneurial activity as Apple, Facebook, Snap, and Google all spend billions of dollars readying products that will ship over the next few years. So, I’m expecting to talk with a lot more people like Adams. I’m available at +1-425-205-1921. </p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
<item>
<title>The New 3D Apple Arriving at WWDC</title>
<link>https://scobleizerblog.wordpress.com/2021/05/15/the-new-3d-apple-arriving-at-wwdc/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Sat, 15 May 2021 19:27:46 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9273</guid>
<description><![CDATA[I have been talking with hundreds of people across the industry and have discovered that the changes coming to Apple are deeper than just a VR/AR headset. Way deeper. The changes already being worked into products represent tens of billions of dollars of investment (I hear Tim Cook has spent around $40 billion getting ready … <a href="https://scobleizerblog.wordpress.com/2021/05/15/the-new-3d-apple-arriving-at-wwdc/" class="more-link">Continue reading <span class="screen-reader-text">The New 3D Apple Arriving at WWDC</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><img width="1024" height="768" data-attachment-id="9337" data-permalink="https://scobleizerblog.wordpress.com/img_3132/" data-orig-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg" data-orig-size="4032,3024" data-comments-opened="0" data-image-meta="{"aperture":"1.8","credit":"","camera":"iPhone X","caption":"","created_timestamp":"1566611943","copyright":"","focal_length":"4","iso":"32","shutter_speed":"0.016666666666667","title":"","orientation":"0"}" data-image-title="img_3132" data-image-description="" data-image-caption="" data-medium-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg?w=300" data-large-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg?w=1024" src="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg?w=1024" alt="" class="wp-image-9337" srcset="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg?w=1024 1024w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg?w=2048 2048w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg?w=150 150w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg?w=300 300w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg?w=768 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>
<p>I have been talking with hundreds of people across the industry and have discovered that the changes coming to Apple are deeper than just a VR/AR headset. Way deeper. The changes already being worked into products represent tens of billions of dollars of investment (I hear Tim Cook has spent around $40 billion getting ready for this new Apple over the past decade). Not all of this will be announced at WWDC. There will be a few announcements over the next year, and really these changes are going to lead to many new products, services, and experiences that will come for decades. This is the fourth paradigm shift for Apple. Previous paradigm shifts brought us the personal computer, the graphical user interface, and the phone. All of which continue changing our lives today, even decades after they were introduced.</p>
<p>The first thing you need to know is that the changes coming are WAY deeper than just a VR/AR headset, which would be important enough on its own. So, what is Apple getting ready to announce over the next year?</p>
<ol class="wp-block-list"><li>A new realtime 3D map of the entire world.</li><li>A new, rebuilt Siri and a new search engine.</li><li>A new mesh network built off of your iPhone that distributes AI workloads to M1 chips in your home.</li><li>A new VR/AR headset, which I call “the TV killer” or “the HoloDeck,” with many devices planned for the next decade, from glasses to contact lenses. (Arrives in 2022, with glasses to follow sometime before 2025.)</li><li>A new kind of programmatic surround sound (Spatial Audio is just the start).</li><li>New 3D experiences for inside cars.</li><li>Eventually a new car service itself.</li><li>A new OS for wearable, on-face computers.</li><li>A new set of tools for developers to build all of this.</li><li>New 3D services for things like music, fitness, education, and more.</li><li>A new, portable gaming device that will interact with this 3D world.</li><li>A new 3D audio service. <a href="https://9to5mac.com/2021/05/15/comment-apple-music-hifi-spatial-audio/">Leaks from 9to5Mac about that</a> came out just today.</li><li>A new kind of noise cancelling, built on array microphones (an early version of this is just arriving in the new iMacs), and a new kind of video and audio sharing network, so that we will be able to have all sorts of new walkie-talkie-like features.</li></ol>
<p>Wrap these up and Apple is about to announce a major shift: from 2D screens, interfaces, and experiences, to true 3D ones. Eventually, I hear, Apple will even bring 3D to 2D monitors.</p>
<p>These changes have been planned all the way back to Steve Jobs. Tim Cook has been buying many small companies over the past decade and traveling the world, visiting factories that no other American leader has visited, and quite a few startups, too. Tim Cook knows the competition extremely well and is about to jump years ahead of everyone, many developers outside of Apple who are familiar with its plans tell me.</p>
<p>Why am I so confident? <a href="https://twitter.com/AppleXRStrategy">Because patents are raining out of the sky</a>. When I worked at Microsoft I got in trouble with the lawyers over patents, so they made me sleep with a lawyer (true story, that was my punishment). Over a weekend I learned a LOT about the patent system. A patent is a legal monopoly that lasts for roughly 17 to 20 years, depending on when it was filed. So, if hundreds of patents are being published, it tells you a major new set of products and services is coming. They wouldn’t release the patents if they weren’t ready to come to market with products. Doing so would be extremely stupid, the lawyer taught me, because a big company only has those years to make money before everyone else can copy it and drop prices to the floor.</p>
<p>At first I was excited by rumors of glasses and VR/AR headsets. There are many. But developers who are building for Apple told me the biggest strategic shift is the move to build a new 3D map of the entire world. That is the basis for the new Apple and is important for a range of new products, from robots that will do things around your house, to cars that will drive you around, to augmented and virtual reality products that will bring new kinds of games and experiences to all of us.</p>
<p>Disclaimer: Apple is the #2 position in my investments, after Tesla. I also am invested in Apple competitors Qualcomm, Snap, Microsoft, and Amazon, and in about 50 other companies in a diverse portfolio. That said, I’m very bullish about Apple and expect it to be a lot bigger by 2030 than it is today, because of this strategic shift underway.</p>
<p>They also told me to look far deeper into what is currently in shipping iPhones, Macs, and other products. For instance, the M1 chip inside the new Macs Apple has just released has about 17% of its capabilities dedicated to AI workloads. That part of the chip really hasn’t been used much yet. It’s sitting in my Mac right now doing nothing. Next to the M1 chip in my new Mac Mini (which is an amazing computer, fast, quiet, and fairly low cost — under $1,000) is an Ultra Wideband chip, which also is mostly unused, even if you just got some of those new AirTags, which have one of these chips inside too.</p>
<p>If you’ve read up to here you’ve gotten the basics. Now I’ll dig into each of these, give you my thoughts on what this all means for all of us, and what I expect at WWDC on June 7.</p>
<p>I am tracking all of this on a new Twitter account at: <a href="https://twitter.com/AppleXRStrategy">https://twitter.com/AppleXRStrategy</a>. If you follow that you’ll see mostly retweets of reports. It’ll be very active around WWDC for sure.</p>
<p>+++++++++++++++++++++++++++++++++++++</p>
<p><strong>A world mapped in real-time 3D</strong></p>
<p>Twenty-seven months ago an Apple Mapping car drove down my street. On the front were five LIDARs. As they spin, the lasers inside “cut up” the world into little virtual voxels (volumetric pixels; think of them as virtual cubes that you can see with Augmented Reality glasses — sort of like how Minecraft or, really, any video game works). I call them digital sugar cubes, to help people understand. Each virtual sugar cube (my street has millions of them now) has some unique things, which I sketch in code right after this list:</p>
<ol class="wp-block-list"><li>A unique identifier. Probably an IP address, but might be proprietary to Apple. Either way, what this means is a computer somewhere else in the world can change the data on the cube, or show users what my street looks like. That part is already in Apple Maps. You can walk down my street virtually already, but that only hints at what will come next.</li><li>A virtualized microphone. Huh? Yes, you soon will be able to talk to the street, to the trees, to, well, anything, including a Coke can or a car moving by.</li><li>A virtualized speaker. So all that can talk back to you as you walk around with future devices.</li><li>A virtualized display. So you can change the Coke can into something else, from a video screen to, well, something like a virtual animated character. </li><li>A virtualized database. So developers can leave data literally on any surface or, even, the air around you.</li><li>A virtualized computer. Think of what you would do if you had a virtualized Macintosh on every inch. Go even further. What if you had a virtualized datacenter on every inch? You could do some sick simulations and distortions of the real world. </li></ol>
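<p>To make that list a little less abstract, here is a toy sketch of what one of these digital sugar cubes might carry. This is my guess at a shape for the data, for illustration only, not Apple’s actual schema.</p>
<pre class="wp-block-code"><code>from dataclasses import dataclass, field

@dataclass
class Voxel:
    """One "digital sugar cube" of a real-world 3D map (illustrative only)."""
    voxel_id: str                          # globally unique address for this cube
    center: tuple[float, float, float]     # where it sits in the world, e.g. lat, lon, altitude
    size_m: float = 0.05                   # edge length in meters
    surface: str = "unknown"               # what computer vision thinks is here: "asphalt", "coke can", ...
    data: dict = field(default_factory=dict)  # the "virtualized database": anything apps leave behind

    def speak(self, text: str) -> None:
        """The "virtualized speaker": a renderer would spatialize this audio at self.center."""
        print(f"[{self.voxel_id}] would play {text!r} at {self.center}")

    def display(self, content: str) -> None:
        """The "virtualized display": AR glasses would draw this content on the cube."""
        self.data["display"] = content

curb = Voxel(voxel_id="street/maple-ave/0000042", center=(37.77, -122.41, 12.0))
curb.display("bus arrives in 3 minutes")
curb.speak("watch your step")</code></pre>
<p>The interesting part is the last field: once every cube of the world can hold data, apps can leave information “on” any surface and render or play it back for whoever walks by with the right glasses.</p>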
<p>Soon your entire house will be scanned in 3D, and Apple will, thanks to new advances in Computer Vision, catalog everything it sees. That might sound scary, and it is, even to me, but it does bring amazing new capabilities, which I’ll go into later. I have <a href="https://twitter.com/i/lists/1052973537944694784">a Twitter list of people and companies doing this new Computer Vision</a>, and what is coming to market now is absolutely stunning. A camera on your phone can already recognize when it is looking at a Coke can. Just get Amazon’s latest iPhone app; at the top there is a scanning feature that already does this.</p>
<p>To see what 3D scans look like, <a href="https://twitter.com/albn">watch Alban Denoyel</a>. He’s the founder of <a href="https://sketchfab.com">Sketchfab</a>, a service that holds millions of 3D scans people are doing. There’s a whole community of people who are scanning their lives and uploading a 3D scan every day. </p>
<p>This new 3D map is the basis for an entirely new way of computing on top of the real world. Apple isn’t alone in building one, either. Amazon, Tesla, Google, Facebook, and others, including most autonomous vehicle companies, are building the same for various purposes. Just this week <a href="https://nianticlabs.com">Niantic</a>, the largest AR company, which made the Pokémon Go game, announced such a map/platform. <a href="https://ar.dev">It brags</a> that its new platform is an “operating system for the world.” Now think about how much more data Apple already has about the real world, due to the mapping cars driving around along with many other data sources. Soon, as we walk around the world with headsets or glasses on, we will add a LOT of data to this map, which will be updated in real time.</p>
<p>+++++++++++++++++++++++++++++++++++++</p>
<p><strong>A new Siri and new search</strong></p>
<p>Four years ago I had dinner with the guy who was then running Siri at Apple. I asked him, “What are you learning by being at Apple?” He said, “I learned Google is learning faster than we are.” <br><br>“How do you know that?”</p>
<p>“We instrumented Google and discovered that its AI systems are learning faster than ours are.”</p>
<p>You know this to be true now. Why? Google Home and Assistant are a LOT better at answering questions than Siri is. So, for the last four years Apple has been buying AI company after AI company and is building a new Siri and a new search engine. </p>
<p>From what I hear this new Siri will outperform Google in at least one hugely important area: It will know what you are looking at, and what you are holding or touching. Imagine looking at my Coke can (I actually drink mostly Hint Water, but Coca-Cola is a brand every human understands and probably drinks once in a while at least). Using the new Siri you will be able to ask “how much are 20 of these on Amazon?” For the first time a computer will be able to answer. Today Siri (and Google) have no freaking idea what you are talking about when you say “of these.” (Amazon, like I said, can already do this via a camera, but very few people use that and understand how powerful it is — when you get glasses on your face the affordances change and you’ll see just what I’m talking about).</p>
<p>The rebuilt Siri will be far more flexible, and will be able to hook up to a lot more things. For instance, the old Siri understands my words just fine when I ask it “how many people are checked in on Foursquare at the New York City Ritz?” Foursquare actually has an API with an answer to that question, but Siri isn’t hooked up to it because its AI is an older, inflexible design that needs a ton of hand coding. </p>
<p>The new Siri will be able to learn about such Web APIs much more easily, and will write the AI connections as users search. </p>
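<p>At its core, that kind of hookup is just a web request. Here is a minimal sketch of how an assistant could answer the Ritz question if it were wired to a check-in API; the endpoint and response field below are hypothetical placeholders, not Foursquare’s real API.</p>
<pre class="wp-block-code"><code># Minimal sketch: a voice assistant answering a question by calling a web API.
# The endpoint and response field below are hypothetical, used only to
# illustrate the idea; this is not Foursquare's actual API.
import requests

def checkins_at(venue_name: str) -> int:
    resp = requests.get(
        "https://api.example.com/v1/venues/checkins",   # hypothetical endpoint
        params={"query": venue_name},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()["checkin_count"]                 # hypothetical field

if __name__ == "__main__":
    print(checkins_at("New York City Ritz"))
</code></pre>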
<p>+++++++++++++++++++++++++++++++++++++</p>
<p><strong>A new mesh network</strong></p>
<p>If you buy a new iPhone, it has a new “U” chip in it for a new kind of network: Ultra Wideband (UWB). </p>
<p>What does this new wireless chip bring us?</p>
<ol class="wp-block-list"><li>It connects automatically. Which is why your AirTags can be found by other people who have iPhones.</li><li>It brings between seven and 40 megabits of bandwidth, which is more than Bluetooth.</li><li>Location awareness. Each antenna broadcasting UWB encodes into the radio signal where in 3D space each radio is. Which is why you can use your iPhone to find Airtags inside your couch.</li></ol>
<p>Really UWB is a rethought Bluetooth. I already have half a dozen devices in my home that have these chips, with many more arriving soon as I get a few more AirTags for various things in my home. </p>
<p>The more of these devices you have around you, the better the location lock can work. What’s that? Well, let’s say you have a new kind of volumetric game, or a screen on your table in front of you. The UWB network builds a “fingerprint” of your home, so that new devices that arrive know EXACTLY where they are, and even where they are aiming. This is something no other company can do yet. Samsung has started putting UWB chips into its devices but it doesn’t have the M1 chip sitting next to these things, so Samsung is way behind and, really, who will buy an entire house full of stuff from Samsung? Not nearly the number of people who will buy Apple.</p>
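<p>The “location lock” idea boils down to ranging: if a few fixed UWB radios know where they sit and can measure their distance to a new device, that device’s position falls out of a least-squares solve. Here is a minimal sketch of that math (classic linearized trilateration); it is not Apple’s implementation, and the anchor positions are made up.</p>
<pre class="wp-block-code"><code># Minimal sketch: solving a device's 3D position from UWB range measurements
# to fixed radios ("anchors") with known positions, via classic linearized
# trilateration. Not Apple's implementation; all numbers are made up.
import numpy as np

def trilaterate(anchors, distances):
    """anchors: (N, 3) known radio positions; distances: (N,) measured ranges."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Subtract the first anchor's sphere equation to linearize the system.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2)
         - (d[1:] ** 2 - d[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Four fixed UWB radios at made-up spots in a room, ranging to one new device.
anchors = [(0, 0, 0), (4, 0, 0), (0, 3, 0), (0, 0, 2)]
true_pos = np.array([1.0, 1.0, 0.5])
ranges = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(trilaterate(anchors, ranges))   # ~[1.0, 1.0, 0.5]
</code></pre>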
<p>Next year, when the VR/AR headset comes out, I will be buying one for each member of my family. As we wear these devices in our living room, playing new kinds of games or watching a new kind of TV together, we will also be able to talk to each other (and, because of the high bandwidth, even send full 3D meshes back and forth) in the real world due to this new network.</p>
<p>+++++++++++++++++++++++++++++++++++++</p>
<figure class="wp-block-image size-large"><img loading="lazy" width="1024" height="768" data-attachment-id="9339" data-permalink="https://scobleizerblog.wordpress.com/img_0295/" data-orig-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg" data-orig-size="4032,3024" data-comments-opened="0" data-image-meta="{"aperture":"1.8","credit":"","camera":"iPhone 11 Pro Max","caption":"","created_timestamp":"1570401453","copyright":"","focal_length":"4.25","iso":"32","shutter_speed":"0.0033333333333333","title":"","orientation":"0"}" data-image-title="img_0295" data-image-description="" data-image-caption="" data-medium-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg?w=300" data-large-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg?w=1024" src="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg?w=1024" alt="" class="wp-image-9339" srcset="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg?w=1024 1024w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg?w=2048 2048w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg?w=150 150w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg?w=300 300w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg?w=768 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>
<p><strong>The TV killer “HoloDeck” and “volumetric football” arrives</strong></p>
<p>Steve Jobs said “I think we figured out a way to do it, and it’s going to be fantastic.” He was talking about the product that is coming in 2022. That’s how long Apple has been working on this.</p>
<p>I got a sneak look at stuff being built for this product. It will bring a stunning set of capabilities that will blow away any physical TV you have ever seen. I hear it brings a lot of the capabilities of Star Trek’s Holodeck. Some familiar with Apple’s plans call it a helmet. Others call it a visor/headphone. I’m just gonna call it the Apple Holodeck until some better name comes along. One important point: Apple’s Holodeck covers your eyes and your ears. If the device is off you will not see the real world and you won’t hear the real world. Turn it on and you’ll see a representation of your living room, and you’ll hear everyone around you. The Apple AirPods Max over-ear headphones already show us how the audio part of this will work: I wore them for hours at Christmas dinner and I could hear everyone as if I didn’t have headphones on. Switch from transparency to noise cancelling and it turns everyone off. Great if there are kids playing with their friends on Discord, as happens in my house every day.</p>
<p>In the back of every Apple store is a million-dollar 8K TV that is about 30 feet across. The Holodeck that will come next year will show 2D virtualized screens that will be way better than that million-dollar TV. Why? In front of you will be two 8K Sony chips, I hear. That will let Apple virtualize TVs to be way way way bigger than that TV in the back of the store. It also will let you have as many virtualized monitors/TVs as you want. So, if you want to build a Las Vegas Sports Book, or a room that has dozens of TVs, you will be able to do so. Imagine being able to watch ALL the football games on Sunday at once? </p>
<p>But doing 2D virtualized screens is just table stakes (by the way, they will be WAY BETTER than what Facebook or anyone else can do because of the UWB network: they will be better locked to the real world, and the devices will know where they are without using much of the camera or other battery-hungry technology). Other companies can’t match what Apple is about to do.</p>
<p>“And what is that Scoble?”</p>
<p>Apple is about to put a TV on every inch of the world. “Huh? Why would I want to do that?”</p>
<p>Well, you could put a video screen on your ceiling, floor, tables, or, any surface, including wrapping the video around things.</p>
<p>And then 3D stuff can “pop out of” the 2D virtualized video screens. So, imagine a Super Bowl halftime show where a performer can “jump out of” your TV and onto the floor in front of you.</p>
<p>If you haven’t seen a Microsoft HoloLens or a Magic Leap you probably have no clue about how amazing this will be. On my HoloLens I play a game where aliens blow holes in my real walls and then crawl through the holes. It’s stunning, even on the shitty optics the HoloLens has. When Apple does the same, but on a set of chips that are going to blow you away, it will completely change what you expect from the entertainment and education companies.</p>
<p>One example: I saw a volumetric football service being prepared for the device coming next year. All around you, you will be able to watch 2D screens like never before. Your home will feel like you are at the stadium watching. On the ground or table in front of you, and in front of those 2D screens, will be a new volumetric surface. I saw this and it’s stunning. Imagine being able to walk around the stadium, or, even, onto the field while the game is going on so you can study what the quarterback is seeing from his perspective and, even, try to make the same throw he is.</p>
<p>By the end of next year I no longer will be watching much TV on small, flat pieces of glass. The TV killer will have arrived and, with it, a new kind of VR and AR. Some inside Apple call it Synthetic Reality. What it really is is a way to both view the 3D world and create content for others to enjoy in new 3D metaverses.</p>
<p>I hear the Holodeck will be announced sometime between now and the end of the year. I won’t be shocked if they tease it at WWDC to get developers excited by what is coming and so that everyone knows to watch the keynote later in the year.</p>
<p>+++++++++++++++++++++++++++++++++++++</p>
<p><strong>Every inch of your life will have sound, including inside your new car</strong></p>
<p>When I sold stereo equipment in a Silicon Valley store in the 1980s, systems only had two channels. Later in the ’80s, new surround sound arrived that put a couple more channels around you.</p>
<p>Now imagine a trillion-channel surround sound system. That’s what’s about to arrive. Sound will be on buttons, on things around your home, and far far more eventually. </p>
<p>Backing up, when you only had two channels to play with, musicians couldn’t really recreate the experience of seeing them play live. I’ve had bands in my home. What’s different about that is that you can literally walk through the band, hearing what it sounds like to be between the tuba and the drummer. That’s impossible to do with two channels, and even the bleeding edge of Spatial Audio today isn’t good enough.</p>
<p>Tomorrow, thanks to the LIDAR and cameras on the front of the Apple Holodeck, the computer will know everything about the space around you. It can then put a virtualized band onto the floor in your living room. You’ll be able to do what I was able to do. Walk around the band. I’ve heard this demoed and it’s amazing, even for old music. I’ve been talking with a bunch of music companies and they are planning on re-releasing a lot of their masters in a new spatial audio format, and next year they will go the full way, locking that audio to the real world. Starting in the 1980s, the industry has masters where every instrument is recorded on a separate track. Older masters will be improved via AI too, and will be “modernized” for the new Holodeck.</p>
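<p>For the curious, here is a crude sketch of what “locking” a virtual instrument to a spot in your room means mathematically: each ear gets a distance-based gain and a slightly different arrival time as you walk around the source. Real spatial audio uses head tracking and HRTFs and is far richer; this only shows the geometry, and every number is illustrative.</p>
<pre class="wp-block-code"><code># Crude sketch of position-locked audio: as the listener walks around a
# virtual instrument, each ear gets a distance-based gain and a slightly
# different arrival time. Real spatial audio (HRTFs, head tracking) is far
# richer; this only illustrates the geometry. All numbers are made up.
import math

SPEED_OF_SOUND = 343.0   # meters per second
EAR_OFFSET = 0.09        # half the distance between the ears, meters

def ear_positions(head_pos, yaw_rad):
    """(left_ear, right_ear) for a head at head_pos facing direction yaw_rad."""
    lx, ly = -math.sin(yaw_rad), math.cos(yaw_rad)   # perpendicular to facing
    x, y = head_pos
    return ((x + EAR_OFFSET * lx, y + EAR_OFFSET * ly),
            (x - EAR_OFFSET * lx, y - EAR_OFFSET * ly))

def render(source_pos, head_pos, yaw_rad):
    """Per-ear gain (1/distance) and delay in seconds for one virtual source."""
    out = []
    for ear in ear_positions(head_pos, yaw_rad):
        dist = max(math.dist(source_pos, ear), 0.1)
        out.append({"gain": 1.0 / dist, "delay_s": dist / SPEED_OF_SOUND})
    return {"left": out[0], "right": out[1]}

# Walk "through the band": standing just past the tuba, it is louder and
# earlier in the nearer (right) ear than in the farther (left) ear.
print(render(source_pos=(2.0, 0.0), head_pos=(0.0, 0.0), yaw_rad=0.0))
print(render(source_pos=(2.0, 0.0), head_pos=(2.0, 0.5), yaw_rad=0.0))
</code></pre>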
<p>Also, if you listen to surround sound movies today on headphones, as you move your head left or right all the sound moves with you and isn’t locked to the TV screen, like it is on a system like my Sonos that does multi-channel surround sound. As the Holodeck arrives that will no longer be true, so watching movies or TV in headphones will be stunning compared to today.</p>
<p>Also, the world will soon be able to apply sound to everything. Car companies tell me they are readying the same for their cars coming next year. Buttons, mirrors, and the steering wheel will all be able to make sound. So your buttons can talk to you. Or, if in a Tesla, have fart noises applied. My kids will love that.</p>
<p>+++++++++++++++++++++++++++++++++++++</p>
<p><strong>A new 3D CarPlay arrives</strong></p>
<p>I’ve had at least one automaker say that all of their cars starting this year will have new 3D sensors watching the dash and probably the driver and passengers, to enable new 3D audio experiences in cars. Every button, or, even different parts of the road, can “talk to you.” These new experiences will be supercharged as Apple brings out its new autonomous car service that’s being built. Imagine playing new kinds of augmented reality games while your car drives. The car project, Titan, is rumored to be coming sometime between 2023 and 2027. That said, new kinds of spatial audio and 3D programmatic sound are coming to cars this year.</p>
<p>+++++++++++++++++++++++++++++++++++++</p>
<p><strong>A new OS arrives for on-face wearables</strong></p>
<p>RealityOS, Apple has called it so far. I haven’t seen this, so this is where I’ll be paying the most attention during WWDC. But I hear it is years ahead of what Facebook is building in its Oculus Quest standalone VR headsets. We’ll see just how capable it is, and how easy it is to use. I’m expecting that it will be amazing at letting you see both the real world and the new virtual layer on top of it, and that it’ll be so easy to use that someone who can’t read will be able to use it, which will bring new people into the computing world (about 800 million people on Earth can’t read). </p>
<p>+++++++++++++++++++++++++++++++++++++</p>
<p><strong>A new, portable gaming device coming</strong></p>
<p>The tech press has been reporting on rumors of a new gaming device, something like a Nintendo Switch. From what I hear the device is far different from a Switch because it’s designed to integrate into this new 3D world that will arrive next year in our living rooms. What could this do? Well, if I am wearing a Holodeck it will be a controller. If someone is playing with me they could use their device to interact with me in the 3D world and play their own games. Of all the devices, I know the least about this one, so it will be interesting to see what gets announced. I hear that will happen sometime by the end of the year, but, as always with rumored dates, even ones that Apple employees give you, you can never be sure until they get on stage and announce things; dates slip.</p>
<p>+++++++++++++++++++++++++++++++++++++</p>
<p><strong>Next-level noise cancelling is arriving now</strong></p>
<p>In the new iMac, Apple is using sensors to track your mouth and focusing the attention of several microphones on it, which will bring that device much better noise cancelling as you do Zoom calls, for instance.</p>
<p>When I worked at Microsoft 15 years ago a researcher showed me the first array microphone that Microsoft Research had built. </p>
<p>What are array microphones? Well, the new Apple AirPods Max over-the-ear headphones have nine microphones. That’s what an array is: a group of microphones that a computer can “focus” on things.</p>
<p>The magic here is that if the computer knows where in 3D space a sound is coming from, it will be able to focus on it or, even, turn it off. Array microphones plus new AI-based focusing technologies will bring noise cancelling features that will be hard for others to match (you need an AI chip, like the one included in the M1 processor, to do this well).</p>
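<p>The textbook version of that “focusing” trick is delay-and-sum beamforming: shift each microphone’s signal by its propagation delay from a chosen point in space, then average, so sound from that point adds up and everything else tends to cancel. A minimal sketch follows; it is the classic idea, not Apple’s actual algorithm.</p>
<pre class="wp-block-code"><code># Minimal sketch of delay-and-sum beamforming: shift each microphone's signal
# by its propagation delay from a chosen 3D point, then average. Sound coming
# from that point adds up coherently; sound from elsewhere tends to cancel.
# This is the textbook idea, not Apple's actual algorithm.
import numpy as np

SPEED_OF_SOUND = 343.0   # meters per second

def delay_and_sum(signals, mic_positions, focus_point, sample_rate):
    """signals: (num_mics, num_samples) array; positions in meters."""
    signals = np.asarray(signals, dtype=float)
    mic_positions = np.asarray(mic_positions, dtype=float)
    focus = np.asarray(focus_point, dtype=float)
    dists = np.linalg.norm(mic_positions - focus, axis=1)
    # Use the closest mic as the time reference; advance the rest to line up.
    delays = (dists - dists.min()) / SPEED_OF_SOUND
    shifts = np.round(delays * sample_rate).astype(int)
    num_samples = signals.shape[1]
    out = np.zeros(num_samples)
    for sig, shift in zip(signals, shifts):
        out[:num_samples - shift] += sig[shift:]
    return out / len(signals)

# Usage: focused = delay_and_sum(mic_signals, mic_xyz, mouth_xyz, 48000)
</code></pre>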
<p>+++++++++++++++++++++++++++++++++++++</p>
<p><strong>A new 3D Apple and paradigm shift arrives</strong></p>
<p>Put it all together and you can see Apple is about to unleash a new paradigm-shifting strategy. One that will change all of our lives very deeply and bring us many exciting new things, from new kinds of education to concerts that will replace half of your living room with a virtual classroom or a virtual concert hall. That’s why I call it a Holodeck. Wearing the Holodeck will let you visit my kitchen and play games with me. This has deep implications for the future of a number of companies. Spotify looks threatened. So does Google. </p>
<p>I hear Apple is dropping lots of hints about all of this at WWDC. It needs to ship new emulators and tools to developers to enable them to build new experiences for this new 3D world coming soon. Then, following WWDC we will see a number of announcements about new products that will lead into shipping these products in 2022. </p>
<p>Exciting times for the technology industry are coming. And, yes, I still think I’ll be buying a new device from Snap, aimed more at photography and augmented reality than the Holodeck will be, and I’ll be buying new glasses from Facebook too, since it is also spending more than $10 billion to develop theirs.</p>
<p>If you know any of this is wrong, or if I’ve missed something you know about, I’ll be doing an audio Twitter Space today to talk about this (probably starting around 2 p.m.) at <a href="https://twitter.com/scobleizer" rel="nofollow">https://twitter.com/scobleizer</a> and you can email me at <a href="mailto:scobleizer@gmail.com">scobleizer@gmail.com</a> or you can send me a message on Facebook, Twitter, or LinkedIn (or Signal, Telegram).</p>
<p>As these announcements are made I’ll look back at this post and see how accurate I was. We’ll know a lot more by the end of the year for sure.</p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
<media:content url="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_3132.jpeg?w=1024" medium="image" />
<media:content url="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/05/img_0295.jpeg?w=1024" medium="image" />
</item>
<item>
<title>Tesla’s data advantage. Can Apple, or others, keep up?</title>
<link>https://scobleizerblog.wordpress.com/2021/02/21/teslas-data-advantage-can-apple-or-others-keep-up/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Sun, 21 Feb 2021 19:14:23 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9214</guid>
<description><![CDATA[This is a long post. Short version: Tesla is way ahead on data collection and is pulling further ahead every day. Do you ever think about what the cameras on a Tesla are doing? Cathie Wood does. She runs ARK Invest and has made billions by investing in disrupting companies, particularly those who use artificial … <a href="https://scobleizerblog.wordpress.com/2021/02/21/teslas-data-advantage-can-apple-or-others-keep-up/" class="more-link">Continue reading <span class="screen-reader-text">Tesla’s data advantage. Can Apple, or others, keep up?</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<figure class="wp-block-image size-large"><img loading="lazy" width="1024" height="576" data-attachment-id="9216" data-permalink="https://scobleizerblog.wordpress.com/img_1244/" data-orig-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg" data-orig-size="4032,2268" data-comments-opened="0" data-image-meta="{"aperture":"2.2","credit":"","camera":"iPhone 12 Pro Max","caption":"","created_timestamp":"1613840424","copyright":"","focal_length":"7.5","iso":"20","shutter_speed":"0.00078308535630384","title":"","orientation":"1"}" data-image-title="img_1244" data-image-description="" data-image-caption="" data-medium-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg?w=300" data-large-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg?w=1024" src="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg?w=1024" alt="" class="wp-image-9216" srcset="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg?w=1024 1024w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg?w=2048 2048w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg?w=150 150w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg?w=300 300w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg?w=768 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>
<p><em>This is a long post. Short version: Tesla is way ahead on data collection and is pulling further ahead every day.</em></p>
<p>Do you ever think about what the cameras on a Tesla are doing? <a href="https://twitter.com/CathieDWood">Cathie Wood</a> does. She runs <a href="https://ark-funds.com">ARK Invest</a> and has made billions by investing in disrupting companies, particularly those who use artificial intelligence. Her favorite company, she reiterated again last week on CNBC, is Tesla. </p>
<p>I do too, and go even further than she does in studying this industry (I have been studying self-driving cars for 15+ years and have <a href="https://twitter.com/i/lists/956617160733818880?s=20">a Twitter list of people and companies building autonomous cars</a>). I even count how many go by my front door. One every few minutes. No one else has a neural network of Tesla’s type, and no other self-driving car has been seen on my street. Did you know a Tesla crosses the Golden Gate Bridge every 80 seconds or faster? Yes, that was me counting cars out there. </p>
<p>I spend hours out front every week cataloging what goes past (I have the new over-the-ear headphones from Apple and usually use them outside while walking around talking to dozens of people all over the world who are building the future – they absolutely rock, by the way). </p>
<p>The photo at the top of this post is the street I live on, right near Netflix’s headquarters; I just shot it on our daily walk. It is part of a multi-billion-dollar war over the future of transportation. Most people have no idea how far ahead Tesla is, including a very smart software developer I talked with today (I won’t name him because that wouldn’t be nice).</p>
<p>Elon just wrote this over on Twitter:</p>
<figure class="wp-block-embed is-type-rich is-provider-twitter wp-block-embed-twitter"><div class="wp-block-embed__wrapper">
<div class="embed-twitter"><blockquote class="twitter-tweet" data-width="550" data-dnt="true"><p lang="en" dir="ltr">Most people have no idea, even though there are so many FSD progress videos posted. Munro understood right away. <br><br>There will be a gap before the next release, but then it will be a step change better.<br><br>Tesla is solving a major real-world AI problem.</p>— Elon Musk (@elonmusk) <a href="https://twitter.com/elonmusk/status/1363054845976961029?ref_src=twsrc%5Etfw">February 20, 2021</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script></div>
</div></figure>
<p>He’s right. When I talk with people, even techies in Silicon Valley, most have no clue just how advanced Tesla is and how fast it’s moving. Wall Street analysts are even worse. I’ve listened to EVERY analyst and NONE except for Wood talks about the data the car is collecting. None have figured out that Tesla is building its own maps, or, if they have, haven’t explained what that means. (I met part of the Tesla programming team building these features and they admitted to me that they are building their own maps, more on that later for the nerds who want to get into why this matters). <br /><br />She says that the data leads Tesla to doing Robotaxis, which will be highly profitable. She’s right, but that’s only one possible business that Tesla can build off of this new real-time data. Others include augmented reality worlds, GIS data to sell to businesses and cities, and new utilities that will run far ahead of Apple and Google’s abilities. More on that in a bit. These are all multi-billion-dollar businesses and are why tens of billions of dollars are being invested in autonomous technologies, including at GM, with its Cruise division (worth already about 1/3 of GM’s total market value), and Apple has leaked that it’s going to be entering the space in 2024 with an effort that will cost many billions too. </p>
<p>First, some basics. </p>
<figure class="wp-block-embed is-type-rich is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="embed-youtube"><iframe title="FSDBeta 8.1 - 2020.48.35.7 - 4K Uncut Drive - Enlarged Display On Night Mode - NO MUSIC" width="1100" height="619" src="https://www.youtube.com/embed/0RlG-RIbGdQ?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>
<p><a href="https://www.youtube.com/results?search_query=tesla+fsd&sp=EgIIBA%253D%253D">Go and watch some of the Tesla FSD videos on YouTube</a>. </p>
<p>You will see just how it works. A Tesla has eight cameras (most of them on the outside of the car). It also has a radar in the front bumper, and several ultrasonic sensors around the car that can see things closer, like a dog running next to the car. These are all fused together into a frame by software and then 19 different AI systems go to work figuring out where drivable road surface is, where signs are, where pedestrians and bicyclists are, and much much more. </p>
<p>My car shows that it can “see” about 100 yards around the car, which you can see in the FSD (FSD stands for “Full Self Driving”) videos on YouTube. <br /><br />When I say “see” I mean it shows me on my screen where stop signs, lights, pedestrians, other vehicles, and even curbs and other features of the road are. </p>
<p>For people who don’t track the bleeding edge of computer vision (<a href="https://twitter.com/i/lists/1052973537944694784?s=20">I have a separate Twitter list of developers who are doing computer vision</a>, which is how robots, autonomous cars, and augmented reality glasses will “see” the world and figure it out) you might not realize just how good computer vision is. The folks over at <a href="https://chooch.ai/demo/">Chooch AI</a>, a new startup out of Berkeley’s computer science lab, have shown me just how good computer vision is and how much cheaper it is getting literally every month. Their system can be trained to do a variety of things with cameras, even see if you are washing your hands properly (important if you are a restaurant worker or a surgeon). </p>
<p>Their system already recognizes 200,000 objects. On an iPhone. </p>
<p>My Tesla doesn’t show that it recognizes that many things, but what it does is amazing. For instance, if I drive by a traffic cone at 90 m.p.h. it shows the cone. If there is a string of cones, my car automatically changes lanes to get away from the construction area and make it safer for everyone.</p>
<p>As I drive down the street it shows parked cars, and, even, garbage cans. But it isn’t showing me the “real” can. It’s showing me what the AI has in its system. This is very important. A lot of people don’t understand how much data just one garbage can generates. It captures what kind of can it is. How big is it? How is it positioned in 3D space on the road bed? And it does this even on garbage days when there are hundreds of cans out on the street I live on. </p>
<p>Why would this data be valuable? Well, a garbage company might want to buy an autonomous garbage truck. They could use these systems to both drive the truck and have a robot figure out where each can is to pick it up (already our garbage trucks only have one person controlling such a robot). It will soon go way further than that. </p>
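<p>To make that concrete, here is a hypothetical sketch of the kind of record a perception system could emit for one garbage can: a class label, a size estimate, and a pose in 3D. The field names are invented for illustration; this is not Tesla’s schema.</p>
<pre class="wp-block-code"><code># Hypothetical sketch of what "one garbage can" looks like to a perception
# system once it is recognized: not pixels, but a typed record with a pose.
# Field names are invented for illustration; this is not Tesla's schema.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str          # e.g. "garbage_can", "traffic_cone", "pedestrian"
    confidence: float   # detector confidence, 0..1
    size_m: tuple       # (length, width, height) estimate in meters
    position_m: tuple   # (x, y, z) relative to the car, in meters
    heading_rad: float  # orientation on the road bed
    timestamp_s: float  # when it was observed
    lane_index: int     # which lane (or curb) it occupies

can = DetectedObject(
    label="garbage_can", confidence=0.97,
    size_m=(0.6, 0.6, 1.1), position_m=(14.2, -3.1, 0.0),
    heading_rad=0.0, timestamp_s=1613840424.0, lane_index=0,
)
# A fleet of cars emitting records like this is what turns a street into a
# continuously updated 3D map.
</code></pre>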
<figure class="wp-block-image size-large"><img loading="lazy" width="1024" height="576" data-attachment-id="9228" data-permalink="https://scobleizerblog.wordpress.com/img_0826/" data-orig-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg" data-orig-size="4032,2268" data-comments-opened="0" data-image-meta="{"aperture":"2.2","credit":"","camera":"iPhone 12 Pro Max","caption":"","created_timestamp":"1611418160","copyright":"","focal_length":"7.5","iso":"125","shutter_speed":"0.0082644628099174","title":"","orientation":"1"}" data-image-title="img_0826" data-image-description="" data-image-caption="" data-medium-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg?w=300" data-large-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg?w=1024" src="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg?w=1024" alt="" class="wp-image-9228" srcset="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg?w=1024 1024w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg?w=2048 2048w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg?w=150 150w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg?w=300 300w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg?w=768 768w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>
<p><strong>Why “HD” Maps are So Important</strong></p>
<p>Tesla’s current autopilot/self driving systems have some major flaws (many of these are fixed in the FSD beta, but most owners don’t have that yet):</p>
<ol class="wp-block-list"><li>They can’t see debris in the road.</li><li>They can’t see potholes that have just formed.</li><li>They can’t join together to make energy usage more efficient.</li><li>They can’t see around corners very well.</li></ol>
<p>Now, there are two approaches. One is to put a LOT more cameras and sensors on the car and a LOT more silicon in each car to properly identify things. I hear that will be Apple’s approach, which is why it’s currently looking at LIDAR sensors. I met the LG camera team while touring the Tesla factory, of all places (they make the cameras in your iPhone), and they said their cameras will soon be a LOT higher resolution than the ones in my three-year-old Tesla. Apple believes it will be able to put a lot more neural network capabilities into each car since it has its own chip manufacturing now. </p>
<p>Disclaimer, I own both Tesla and Apple stock, my number one and two positions.</p>
<p>So, could Apple “beat” Tesla? That is the $64 billion question. I don’t believe so. The photo above shows why.</p>
<p>In Silicon Valley there are already so many Teslas that scenes like this one, on HWY 17 between Los Gatos and Santa Cruz, are quite common. Tesla, I hear, will soon start transmitting data from cars in front of you to your car (and to everyone else too). </p>
<p>Why is this important? Well, one day I was driving in the fast lane of Freeway 85 using my car’s Autopilot/FSD features (my car automatically changes lanes, stops, and basically drives itself already, particularly well on freeways, with the above limitations). Cars in front of me started swerving and braking. Turned out a bucket had fallen out of a truck and was rolling around lane #1 (the freeway where I was has three lanes). </p>
<p>I grabbed the steering wheel and took control, also swerving around the bucket. My Tesla hadn’t seen the bucket, although it was already assisting me in driving, doing very advanced braking. Audi taught me just how good anti-lock and traction control systems are by teaching me to drive on ice. They turned those systems off and I instantly spun the car. Turned them on and they were independently braking each wheel to keep my car from losing traction. Something Elon Musk demonstrated to me with electric engines and traction, too (which is why Teslas are REMARKABLE on ice and snow). </p>
<p>Afterward I talked with the programming team and my car, they said, automatically captures TONS of data during such an event (we call that an intervention, because a human had to intervene in autonomous driving and take over). The programmers have a simulator where they can load all that data and actually “walk around” what happened. Chooch shows that the training is so good that an engineer could just circle the bucket and “teach” the AI systems about what that is. </p>
<p>That’s how they taught it to see stop signs and garbage cans, for instance. </p>
<p>So, now a future version of a Tesla will be able to see the bucket and let everyone else know that there’s an object in the road. “But Waze already does that,” you might say. No it doesn’t. Waze requires a human to tap the screen and say there’s an object on the road. But it doesn’t know whether it’s a bucket, a box, a bedspring, or a beam. And it doesn’t show what lane that thing is in. It certainly doesn’t predict, or track, how said object is moving.</p>
<p>I hear that by the end of the year Tesla will turn on such features. Now, what will that look like on the road? All Teslas will start switching lanes to lane #3. You won’t even know why, even if you are in one, until you pass by the bucket in lane #1. </p>
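<p>Here is a hypothetical sketch of the difference: a fleet-shared hazard is a structured message with a class, a lane, and motion, which a following car can act on automatically. The fields are invented for illustration only.</p>
<pre class="wp-block-code"><code># Hypothetical sketch of a fleet-shared hazard versus a human-tapped report.
# A structured message carries the object class, the lane, and how the object
# is moving, so following cars can act on it. All fields are invented for
# illustration only.

waze_style_report = {"note": "object on road"}    # no class, lane, or motion

fleet_hazard = {
    "label": "bucket",
    "lane_index": 1,                   # leftmost lane
    "position_m": (1204.6, 3.2, 0.0),  # made-up map coordinates
    "velocity_mps": (0.4, 0.0, 0.0),   # it is still rolling
    "reported_s": 1613840424.0,
    "source": "vehicle_camera",
}

def should_change_lanes(my_lane: int, hazard: dict) -> bool:
    """Move over if the shared hazard sits in my lane."""
    return hazard.get("lane_index") == my_lane

print(should_change_lanes(my_lane=1, hazard=fleet_hazard))   # True: leave lane 1
print(should_change_lanes(my_lane=3, hazard=fleet_hazard))   # False: stay put
</code></pre>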
<p>Can Apple match this? Not until it gets a decent number of cars on the road. Since Apple is aiming at 2024 I think it will be way behind and will find it difficult to catch up.</p>
<p>But it might get worse for Apple, and certainly will get worse for car companies that aren’t building these capabilities. Why? Maps and robotaxis. </p>
<p><strong>The next trillion-dollar company</strong></p>
<p>Let’s say Tesla, or, really, anyone, came out with a map that was updated in real time on your phone. Would you continue using Google or Apple Maps that aren’t? Of course not. Here’s why.</p>
<p>One day I was driving my kids on a long trip to pick up some fruit in the Central Valley (I know a strawberry farmer who grows a lot better strawberries than they sell in our local grocery stores). We witnessed a truck burning. Our cameras could have seen that, captured it, shown it to everyone on the map, and kept it updated as firetrucks arrived and the mess was cleaned up. On the way back we were caught in traffic at that spot, which was still being cleaned up. The current maps show traffic, but don’t show WHY there is traffic. </p>
<p>In the future you’ll see that truck burning on the freeway and maybe you’ll decide to take a different route, or delay your trip, saving you much time. Can Google match this? Not really. Google has no cameras passing by the fire. No neural network deciding whether that’s important to upload for everyone else. And even if a human snapped a photo, it is already out of date and doesn’t show the current status. A Tesla rolls by every few minutes. </p>
<p>Look at the front of my home on Google or Apple. The photos there are 20 and 23 months old! On Tesla’s map, if my home was burning down, you’d see that fire within seconds of it starting and you’d be able to watch in near real time as roads get closed and emergency gear gets rolled by (a fire station is on our street, so I get to watch that often too).</p>
<p>So, soon, Tesla will have far better 3D maps, that are updated in real time. Google and Apple can’t match it. Why? No data. Tesla has all the data.</p>
<p>Now, Tesla COULD license this data to Apple, so that Apple could stay relevant. Since Apple has $250 billion in cash, that’s quite a possibility. Which is why I own both Apple and Tesla. Apple is building such a 3D map, from scratch, and is doing a ton of autonomous car work. Remember, Apple could buy GM out of petty cash, so I can never count it out (and Apple has an amazing new VR/AR headset coming in 2022, more on that in a different post) and its AR devices will use this new 3D map it is developing. </p>
<p>Tesla, soon, because of the real-time map it is developing, will be able to solve all four of the problems I named above and add major new capabilities. I hear it already is testing caravanning, for instance. This is the ability for an autonomous car in front to control the brakes and acceleration of everyone behind. Which will let Tesla build a string of cars going on long trips. Doing that will save about 20% of power for everyone in the chain. It’s why NASCAR drivers learn to “draft” behind other cars. Doing so lets them go faster on less fuel. </p>
<p>All of this work will lead to robotaxis.</p>
<p>Think of Uber. That company was invented right in front of me during a Paris snowstorm. The impulse there was to make transportation easier. When I visited a slum in South Africa I heard just how big a deal this was to people. One woman told me Uber changed her life since before then taxis wouldn’t visit the slum, so she had a tough time getting around.</p>
<p>In the research for our latest book, “The Infinite Retina,” I learned that Uber is closer to how we will own cars in the future than other models. Why? When autonomous cars happen, having a car sitting in your garage doing nothing will be very stupid. Something only the rich will do. </p>
<p>Now the economics. An Uber costs about $60 per hour. Go ahead, order an Uber and tell the driver “stay here for an hour and charge me.” Keep in mind that Uber still is losing money at this rate, too. </p>
<p>When autonomous cars come? That cost will go down to less than $10 an hour. This is why Uber invested billions in autonomous vehicle research. For Uber, though, the problems are even worse. It doesn’t make cars. It doesn’t own cars. It can’t force its drivers to buy a new one to get new features. It can’t force its drivers to buy a specific brand. </p>
<p>Let’s talk about the role consistency plays in building a brand. For instance, let’s talk about Starbucks Coffee. Is it great coffee? No. I used to live in Seattle and everyone knows Starbucks’ coffee sucks when compared to high end coffee shops. So, why is Starbucks so loved? (Disclaimer: I own stock in Starbucks too). Because it is ubiquitous AND consistent. Uber is ubiquitous (I took one in Moscow, Russia) but it isn’t consistent for two reasons:</p>
<ul class="wp-block-list"><li>The driver in the car. Sometimes they smoke. I even had one who smelled of alcohol.</li><li>The car itself. Sometimes it’s a brand new Mercedes. Sometimes it’s a beat up Toyota.</li></ul>
<p>Tesla’s robotaxis will fix both problems. Tesla’s economics are that it could rent you a Model 3 for about $10 an hour. Even a top of the line Roadster could rent for $30 an hour. A $200,000 car cheaper than an Uber? Yes! And at these prices Tesla will be HUGELY profitable compared to Uber. The wholesale cost of a Model 3 actually is only $3 an hour. Uber and Lyft can’t compete.</p>
<p>Remember back to that first ride that Elon Musk gave me? What was his sales pitch for Tesla? Fun? Yes! But he really emphasized how electric motors can be made cheaper than gas ones. He didn’t talk about autonomous vehicles back then, or saving the earth. He saw that he could make a car cheaper than a Toyota and, therefore, disrupt the entire industry. Now that more than a million of them are on the road we see he’s right. The lifetime cost of owning a Tesla is lower than owning a Toyota. </p>
<p>And it’s about to get far worse for Toyota.</p>
<p>Why? Well, if you buy a $45,000 Toyota it’ll cost you about $1,000 a month, if you include all the costs. </p>
<p>That’s a lot of hours of driving if a Tesla is $10 an hour. Most people don’t drive that much every month. Even me, I currently put about 30 hours a month into mine (in three years I put 53,000 miles on my Tesla, and I take a ton of long trips — most of my friends put far far less on their cars than I do). So, an autonomous Tesla will cost about $300 a month for me. Far less than owning a car and having it sit in my garage, like both our Tesla and Toyotas are right now.</p>
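<p>Here is that arithmetic in one small block, so the comparison is explicit. The inputs are this post’s own estimates, not verified figures.</p>
<pre class="wp-block-code"><code># The ownership-versus-robotaxi arithmetic above, made explicit. The inputs
# ($10/hour robotaxi, about 30 hours of driving a month, about $1,000/month
# all-in for a $45,000 car) are this post's estimates, not verified figures.
ROBOTAXI_RATE_PER_HOUR = 10.0
HOURS_DRIVEN_PER_MONTH = 30.0
OWNERSHIP_COST_PER_MONTH = 1000.0   # payment, insurance, fuel, maintenance

robotaxi_monthly = ROBOTAXI_RATE_PER_HOUR * HOURS_DRIVEN_PER_MONTH
savings = OWNERSHIP_COST_PER_MONTH - robotaxi_monthly
break_even_hours = OWNERSHIP_COST_PER_MONTH / ROBOTAXI_RATE_PER_HOUR

print(f"Robotaxi:  ${robotaxi_monthly:,.0f}/month")          # $300/month
print(f"Ownership: ${OWNERSHIP_COST_PER_MONTH:,.0f}/month")  # $1,000/month
print(f"Savings:   ${savings:,.0f}/month")                   # $700/month
print(f"Break-even riding time: {break_even_hours:.0f} hours/month")  # 100
</code></pre>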
<p>One last thing. The auto industry asked me to do some research into customer acquisition costs. I went around the country asking “are you ready to get into a car without a steering wheel?” </p>
<p>“Hell no,” is the typical answer I got. One guy in Kansas told me “I’m a narcissistic control freak and there’s no way a computer is going to drive me around.” </p>
<p>Google’s head of R&D told me they had the data to prove that after three rides in one even that guy changes his mind. So I asked a second question “what if the car drove to you and then you had the choice of driving it or not?”</p>
<p>Almost everyone, including that guy in Kansas, said “yeah, I don’t have a problem with what it does when I’m not in the car.” So, Tesla will have very low customer acquisition cost, if any at all (everyone knows a Tesla is fun to drive). </p>
<p>Kraft food execs once told me they spend $34 to acquire a young customer to eat its cheese (in advertising, and other techniques). So, I imagine that Waymo (Google’s Robotaxi) will have to spend a lot more than that, I figure more than $100 for a while, to get people to try its system, which has no steering wheel and is totally autonomous (it just started working without a driver in Phoenix, Arizona, and San Francisco). </p>
<p>Plus, Tesla has a huge advantage in brand too over Waymo. <br /><br />Translation: Cathie Wood is right. Robotaxis will make a crapload of money for Tesla. Now, here’s the rub and why Tesla’s valuation is so high (if you thought it was just a car company it’s extremely overvalued): a robotaxi system doesn’t need many cars on the road. Uber has something around a million drivers. Worldwide. Tesla could build that many cars for only a few billion dollars. Plus, its owners have already funded the building of more than a million cars, and with the Cybertruck on the way, I expect to see that sell many millions. </p>
<p>So, Tesla has the brand, the distribution, the consistency, the low-customer acquisition cost, and other advantages (like a supercharger network that let us drive ours across America) to make a ton of profit PER CAR. That is what I’m betting on, and it’s what Cathie Wood is betting on too.<br /><br />Thanks to <a href="https://twitter.com/BrianRoemmele">Brian Roemmele</a> who has been talking to me about this for more than a year. He sees ahead better than anyone else I track right now.</p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
<media:content url="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_1244.jpeg?w=1024" medium="image" />
<media:content url="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/02/img_0826.jpeg?w=1024" medium="image" />
</item>
<item>
<title>Now that my career is over…</title>
<link>https://scobleizerblog.wordpress.com/2021/01/08/now-that-my-career-is-over/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Fri, 08 Jan 2021 13:26:29 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9194</guid>
<description><![CDATA[Reading “The Scarlet Letter: A Romance,” back in high school in the early 1980s I never imagined I’d have to wear a letter of shame on my chest while moving around Silicon Valley (don’t understand? Just Google my name and you’ll see my “Letter A.”) That book tells the story of Hester Prynne, who conceives … <a href="https://scobleizerblog.wordpress.com/2021/01/08/now-that-my-career-is-over/" class="more-link">Continue reading <span class="screen-reader-text">Now that my career is over…</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<p>Reading <em>“The Scarlet Letter: A Romance,”</em> back in high school in the early 1980s I never imagined I’d have to wear a letter of shame on my chest while moving around Silicon Valley (don’t understand? Just Google my name and you’ll see my “Letter A.”)</p>
<p>That book tells the story of Hester Prynne, who conceives a daughter through an affair and then struggles to create a new li<span class="has-inline-color has-black-color">fe of repentance and dignity</span>, <a href="https://en.wikipedia.org/wiki/The_Scarlet_Letter">says Wikipedia</a>. Hester’s “Letter A” stood for adultery. </p>
<p>My “Letter A” stands for that, along with abuser and, mostly, asshole. Shame is a heavy thing to bear, for sure. A little heavier than those new Apple headphones. <img src="https://s0.wp.com/wp-content/mu-plugins/wpcom-smileys/twemoji/2/72x72/1f642.png" alt="🙂" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>
<p>Lately, I’ve come to <em>accept</em> that I’m an asshole and that my career is over. </p>
<p>One part of that acceptance is coming to understand that my shame has given me so many gifts, including a 9,000-mile road trip with my kids and a life that doesn’t include traveling. Now that the travel is gone, I see just how hard it was to stay sane on the road. So happy we did that road trip in 2018.</p>
<p>Last year was pretty tough, though. At Christmas Maryam and I remembered all our family and friends we had lost. The year took my dad and one of my best friends, along with five others I used to know. I have a feeling more losses are ahead, as COVID keeps going up exponentially. Yesterday alone America lost 4,500. On top of that I was still learning to deal with my new roles as the kids were home all the time, and Maryam was home for the first time in a number of years too. I am tasked with getting them up most days and onto Zoom school, which is what I call distance learning, since both kids are on Zoom. </p>
<p>There was also a problem popping up in my mental health, though. Depression got a deeper hold on me. </p>
<p>Some days I couldn’t get up. I just didn’t want to do anything. And when I was up I had a ton of household and emotional work to do, which just took my energy. Of course that strained the relationship between me and my company cofounder, Irena Cronin. I was unfair to her, and sinking because I just couldn’t tell anyone what was going on. </p>
<p>My wife asked me to see a therapist again last Fall because she could see I wasn’t dealing with life well. I did, and he put me on depression medicine. That helped me a lot. In just a few months I’ve lost more than 20 pounds, now more than 50 down from my high four years ago, and it feels like it filled in a bunch of potholes in my soul. </p>
<p>Another part of what I didn’t like about consulting work is that my stock market investing was doing a lot better than what I was doing with Irena. The last year was extraordinary in the market, and brought us good fortune, even while my mental health suffered. Over the last year my investment returns were 15+ times higher than my other income, <a href="https://www.amazon.com/Infinite-Retina-Computing-technologies-revolution/dp/1838824049/ref=sr_1_1?dchild=1&keywords=The+Infinite+Retina&qid=1610111743&sr=8-1">despite writing a critically-acclaimed book about Spatial Computing</a> and doing a lot of consulting for big companies and small.</p>
<p>Which brings me to what is next. In talking about the changes with tons of developers and entrepreneurs I see that the changes are going to be extraordinary and quick. Autonomous cars will become much more available in 2022. Tesla’s Cybertruck will be next-level in all ways. So much new tech is being readied for it. Apple is readying what I have heard is the biggest product introduction of all time. So big that it will come in two parts. Part One comes this year. More on what Apple is up to soon. It’s big and most people, even the smart analysts, haven’t figured it out yet.</p>
<p>Anyway, that brings me to my next project: a science fiction ebook about what life is like in 2022. The premise of the book? That the next 24 months will see more new technology ship than human beings have ever shipped and deep change is about to hit. Not to mention you will give companies a LOT of new data from these things. Privacy will radically change over the next 24 months due to new devices from Apple, Facebook, Google, and others.</p>
<p>I always wanted to try my hand at science fiction but most science fiction, like Star Trek, depicts a future that’s either way off, or unattainable. I wondered if I could write about something much more short term. A family who gets 2022 technology early. That morphed into a fictional neighborhood that Apple, Tesla, and other companies, are using to test out new devices, services, and more. <br><br>I’m noodling around, still at the beginning. It might massively change as I write a new ebook between now and June. And, yes, my “Letter A” is helping me build characters with some depth in human experience. I’m lucky that I don’t need to make money at the moment, so can take the time to do that and continue figuring out how to be less of an asshole. It’s a work in progress.</p>
<p>Anyway, now that my online media career is over, it’s OK. I have these new Apple headphones, which really rock. Most people haven’t figured out how much Apple is using neural networks here. Last week I was talking to someone and a lawn mower started up. The guy on the other side of the conversation told me that the lawn mower was “turned off” within a second or two and he couldn’t hear it, despite it running right next to me. Here’s my son, Milan, wearing a pair. He loves them. He hates VR. Which tells you a lot about the kind of human-factors work Apple is doing and the kinds of things we’ll have to go through as we enter a new paradigm of computing.<br><br>As I write the book I’ll be online a lot less. Hope to be done by June. As for the “Letter A” I am wearing around town? Well, I’ve found another gift it gives: it helps other people deal with their own troubles. Helping others is the only real way out of the burden it brings. I have a long way to go. </p>
<figure class="wp-block-image size-large"><img loading="lazy" width="1024" height="768" data-attachment-id="9201" data-permalink="https://scobleizerblog.wordpress.com/2021/01/08/now-that-my-career-is-over/img_0685/" data-orig-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg" data-orig-size="4032,3024" data-comments-opened="0" data-image-meta="{"aperture":"2.2","credit":"","camera":"iPhone 12 Pro Max","caption":"","created_timestamp":"1609604460","copyright":"","focal_length":"7.5","iso":"50","shutter_speed":"0.0082644628099174","title":"","orientation":"1","latitude":"37.272369444444","longitude":"-121.9637"}" data-image-title="Milan Scoble" data-image-description="" data-image-caption="<p>Milan wears Apple’s new AirPods Max headphones.</p>
" data-medium-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg?w=300" data-large-file="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg?w=1024" src="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg?w=1024" alt="Milan wears Apple's new AirPods Max headphones." class="wp-image-9201" srcset="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg?w=1024 1024w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg?w=2048 2048w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg?w=150 150w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg?w=300 300w, https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg?w=768 768w" sizes="(max-width: 1024px) 100vw, 1024px" /><figcaption>Milan wears Apple’s new AirPods Max headphones.</figcaption></figure>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
<media:content url="https://scobleizerblog.wordpress.com/wp-content/uploads/2021/01/img_0685.jpeg?w=1024" medium="image">
<media:title type="html">Milan wears Apple's new AirPods Max headphones.</media:title>
</media:content>
</item>
<item>
<title>This full-body MRI scan could save your life</title>
<link>https://scobleizerblog.wordpress.com/2020/11/01/this-full-body-mri-scan-could-save-your-life/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Sun, 01 Nov 2020 18:54:48 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9175</guid>
<description><![CDATA[This summer a 40-year-old friend and brilliant software engineer, Brandon Wirtz, died due to colon cancer and my dad died of pancreatic cancer too. At first neither of their doctors diagnosed properly (Brandon was frequently getting sick and my dad kept having more and more problems). Ever since Brandon discovered his cancer, I’ve started taking … <a href="https://scobleizerblog.wordpress.com/2020/11/01/this-full-body-mri-scan-could-save-your-life/" class="more-link">Continue reading <span class="screen-reader-text">This full-body MRI scan could save your life</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<p>This summer a 40-year-old friend and brilliant software engineer, Brandon Wirtz, died of colon cancer, and my dad died of pancreatic cancer. At first neither of their doctors diagnosed them properly (Brandon was frequently getting sick and my dad kept having more and more problems). Ever since Brandon discovered his cancer, I’ve been taking healthcare more seriously, wondering if there’s a way to diagnose such diseases earlier.<br><br>Last week a new clinic, Prenuvo, opened near San Francisco that promises to do just that with a full-body MRI scan (Magnetic Resonance Imaging). This is like a high-resolution X-ray machine except it doesn’t use radiation to make its images.<br><br>I was lucky enough to be one of the first to be scanned in its new location (it has been doing such scans for a decade up in Vancouver, Canada) by founder Dr. Raj Attariwala. Here I filmed the consultation with Dr. Raj right after my scans were done.</p>
<figure class="wp-block-embed is-type-rich is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="embed-youtube"><iframe title="Prenuvo, my full-body MRI scan. Yours can save your life." width="1100" height="619" src="https://www.youtube.com/embed/Dnru3qpHSSE?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>
<p>The process? You pay $2,500. You spend an hour inside an MRI machine. For me, it was a chance to hold perfectly still for an hour, while I listened to the machine whir and buzz around me. After the hour, it takes a few minutes to process your images and then you sit down with a doctor, like I did here.<br><br>Luckily for me I got a pretty clean bill of health, but you can see this is a powerful diagnostic tool to help doctors find dozens of problems before they become untreatable. Everything from heart disease to a variety of cancers. You can see how Dr. Raj walks through my entire body, including my brain, looking for problems that I’ll need to work on. He did find one with me: my mom had a bad back, and it looks like I’ve been blessed with the same problem, so he told me to do exercises to strengthen my core muscles to minimize it in the future.<br><br>In talking with cofounder/CEO Andrew Lacy, I learned the company has developed its own MRI machine to do these scans. He told me that most other MRIs are used only for specific body parts, usually after a cancer or problem has already been found. Prenuvo, he told me, has modified the software running the MRI machine to do specialized full-body scans that other machines can’t do easily. Also, his team is using these images to build machine learning to assist the doctors in finding various problems and in its plans to scale this to more people over time (the San Francisco location has two scanners that can do two people an hour; the company plans to open more locations and do more scans per hour, but that will need more AI work, and training of doctors to look for problems when they are early rather than late-stage like they usually see).</p>
<p>It’s amazing to see inside your own body for the first time, and the company gives its customers all of their scans in a mobile app that you can explore on your own time later. It also sends the scans to your primary-care physician, or to other doctors for second opinions.</p>
<p>You can learn more about this service at <a href="https://www.prenuvo.com">https://www.prenuvo.com</a>.</p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
<item>
<title>Getting ready for VMworld with AI deep dive with 14-year employee</title>
<link>https://scobleizerblog.wordpress.com/2020/09/23/getting-ready-for-vmworld-with-ai-deep-dive-with-14-year-employee/</link>
<dc:creator><![CDATA[Robert Scoble]]></dc:creator>
<pubDate>Wed, 23 Sep 2020 14:56:40 +0000</pubDate>
<category><![CDATA[Personal]]></category>
<guid isPermaLink="false">http://scobleizer.blog/?p=9170</guid>
<description><![CDATA[Business of the future will need to be more predictive. That’s what VMware’s Justin Murray, a long-time VMware employee told me here as he explained the latest in AI and Machine Learning that he’s seeing evolve. The folks who run VMware’s huge conference, VMworld (happens September 29-October 1), got interested in me after reading my … <a href="https://scobleizerblog.wordpress.com/2020/09/23/getting-ready-for-vmworld-with-ai-deep-dive-with-14-year-employee/" class="more-link">Continue reading <span class="screen-reader-text">Getting ready for VMworld with AI deep dive with 14-year employee</span> <span class="meta-nav">→</span></a>]]></description>
<content:encoded><![CDATA[
<figure class="wp-block-embed is-type-rich is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<div class="embed-youtube"><iframe title="Getting ready for VMworld with AI deep dive with 14-year employee" width="1100" height="619" src="https://www.youtube.com/embed/BdSToh2MnmA?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe></div>
</div></figure>
<p>Business of the future will need to be more predictive.<br><br>That&#8217;s what Justin Murray, a long-time VMware employee, told me here as he explained the latest AI and machine learning trends he&#8217;s seeing evolve.</p>
<p>The folks who run VMware&#8217;s huge conference, VMworld (which happens September 29-October 1), got interested in me after reading my latest book, &#8220;The Infinite Retina,&#8221; which is about how augmented reality and artificial intelligence, along with a few other technologies I call Spatial Computing, will radically change seven industries over the next decade.<br><br>You see this predictive nature of AI in robots, autonomous cars, and even in services like Spotify, which uses AI to build playlists by predicting what kind of music we will want to listen to. Autonomous cars predict the next action of people and other cars on the streets; the AI inside is always trying to answer questions like, &#8220;Will that child walking on the side of the road try to cross the street in front of us?&#8221; Properly predicting what is about to happen on the road is important, and I ask Murray whether that same predictive technology is what he thinks will be used elsewhere in businesses.</p>
<p>Murray says &#8220;yes,&#8221; but then goes a lot further and predicts some of the hot discussions at VMworld next week.<br><br>For instance:<br><br>1. Every major cloud provider, like Amazon&#8217;s Web Services, Microsoft&#8217;s Azure, or Google&#8217;s Cloud Services, is buying tons of NVidia&#8217;s powerful GPUs for its datacenters to support the new, predictive AI services that businesses are starting to build.</p>
<p>2. The AI architecture and tooling stack that runs on these is seeing sizeable changes, and NVidia and VMware will make some announcements there next week.</p>
<p>3. New AI supercomputers are being built because NVidia cards are being &#8220;connected together&#8221; in powerful new ways, making new kinds of workloads possible.</p>
<p>Why do I care, especially since I&#8217;m usually interested in new startups or consumer electronics gadgets?<br><br>Well, let&#8217;s walk through my life. Recently I got a June Oven. I put a piece of toast in it, or a piece of salmon, and it uses a small camera and an NVidia card inside, along with machine-learning software, to automatically recognize what I am trying to cook or bake. It&#8217;s magic. Plus, I never burn my toast anymore the way I used to when I didn&#8217;t get the settings right.<br><br>Or look at the new DJI Osmo 4 I used to shoot the intro to this video. It has three little motors, and the instructions for how to move those motors are generated, in part, by machine learning that is constantly evaluating how best to steady my iPhone.<br><br>Finally, look at my Tesla. Murray told me there are more than a dozen AI-based systems running on it, and it drove most of the way to VMware&#8217;s headquarters in Palo Alto, CA, which makes my drives more relaxing (particularly in traffic, where my car does all the stop-and-go work) and safer.<br><br>AI has already radically changed my life, and most people in the industry say it is just getting going. One reason VMware is compensating me to do this series of posts is that about a decade ago I was the first to see Siri, the first AI-based consumer application to come to market. My posts back then kicked off the AI age, and a decade later AI has fallen sharply in price and gotten simpler to do, so it&#8217;s being used in a lot more new workloads.<br><br>You might not realize just who VMware is, but you probably use one of its services every day, or, rather, the companies you deal with every day use VMware to run their businesses. When I worked at Rackspace, a major cloud computing provider, we used VMware all over the place in our datacenters. &#8220;VM&#8221; stands for &#8220;virtual machine,&#8221; and VMware is the company that popularized that term for technology that can split a physical computer into many &#8220;virtual&#8221; computers (or join it together with millions of other machines to build a supercomputer). Today that technology is used for all sorts of things, from letting you manage your laptop better and run it more safely, to managing huge businesses, to soon managing new Spatial Computing infrastructure and devices (which I wrote about in my latest book).</p>
<p>All of this will be discussed at VMworld, a huge, free virtual event with more than 100,000 attendees and hundreds of sessions covering not just what is happening in AI but a whole range of technologies businesses use every day, from security to datacenter management. If you like this conversation, Justin is just one of thousands of VMware employees and customers you can meet at VMworld, so <a href="https://www.vmworld.com/en/index.html?src=sp_5f2306bf7d71f&cid=7012H000001Oow4">register for your free attendee badge here</a>.</p>
<p>I don&#8217;t even need an AI to predict that you&#8217;ll find at least a few of the 900 sessions on offer useful for your business. See you on the 29th!</p>
]]></content:encoded>
<media:content url="https://1.gravatar.com/avatar/d80ee397069a8fe5f564e3bc9e4a9988d53610a443c0863d86c5b9e159bd6c4f?s=96&d=https%3A%2F%2F1.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D96&r=G" medium="image">
<media:title type="html">Scobleizer</media:title>
</media:content>
</item>
</channel>
</rss>