Virtual CES 2021 Part 2

When we attend CES in person, our primary goal is to learn about the latest technologies and think about how they will impact us. There is so much coming at you that you can't take it all in, so you quickly learn to ignore the political speeches and dive right into the exhibits. This year, everything was scripted and guided, so there was considerably more wasted time trying to boil down what's relevant.

Between tech sessions, I'd sort tiles and click each one to discover anything new and interesting. There are a lot of duds, but that's also what you see when you walk the floor of CES. Many of these companies are a solution looking for a problem to solve, and because startups make up about a third of the show, you see a lot in that category. You come across some that are useful for now, such as a COVID home testing kit, though I doubt it has a bright future, given that the pandemic will at some point go away.

Here is what I thought was interesting, in no particular order.

A company called 3diTX is launching a 3D weaving technology that allows weaving to move in multiple directions. Think fabric without seams, as one continuous weave. I found that interesting, given that just about every science fiction movie you will ever see still shows clothes with seams.

LG made the case that everything technology-related is a platform on which to launch something else. I thought that was kind of an interesting concept, but the more I thought about it, so is everything else. They talked about their rollable OLED displays and how they can be lowered much like a window shade or put on a collapsible frame. We first saw that film back at CES in 2005; Jeff Higashi and I were looking at it together. It was only last year that we saw it as a real product, but I still don't see a huge use case for it yet. I do see a case for curved surfaces, poles, places like that, but as a roller-shade-style screen, not so much.

One of the more interesting discussions I heard at CES was Amnon Shashua talking about how software solves single problems, which, as he pointed out, is not how humans apply their own thinking. When we come up with a solution to a problem, we naturally look for other ways to apply the same solution to other problems. Software won't do that on its own, but the argument is that it is probably what's next. Software that can learn one solution and then automatically apply it to other problems would open a whole new world. It was an interesting concept to ponder.
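To make the idea a little more concrete, here is a toy Python sketch of what reusing a learned "solution" on a new problem can look like. This is entirely my own illustration using scikit-learn, not anything Shashua presented, and the data and tasks are made up:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# "Problem A": learn a compact representation plus a classifier.
X_a = rng.normal(size=(200, 20))
y_a = (X_a[:, 0] + X_a[:, 1] > 0).astype(int)
rep = PCA(n_components=5).fit(X_a)  # the learned "solution"
clf_a = LogisticRegression().fit(rep.transform(X_a), y_a)

# "Problem B": a different task, but instead of starting from scratch
# we reuse the representation learned on problem A.
X_b = rng.normal(size=(200, 20))
y_b = (X_b[:, 2] > 0).astype(int)
clf_b = LogisticRegression().fit(rep.transform(X_b), y_b)
print("problem B accuracy:", clf_b.score(rep.transform(X_b), y_b))
```

The interesting part is the last step: nothing learned for problem A is thrown away, it is carried over to problem B automatically. Doing that without a human deciding when the carry-over applies is the hard part he was describing.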

42Maru, Inc. launched a new Q&A software platform that I thought was very interesting. I've wondered for years: if you could keep drilling down to everything in its most binary form, yes or no, could you solve every problem? The idea is to keep diving deeper until you arrive not just at an answer, but the only answer. That concept fascinates me.
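Just to play with the idea, here is a tiny Python sketch of that drill-down, my own toy example and not how 42Maru's platform actually works. Each yes/no answer cuts the candidate set until only one answer remains:

```python
def drill_down(candidates, answered):
    """Shrink a candidate set with yes/no answers until one remains."""
    for yes_set, answer in answered:
        # Keep the "yes" subset on a yes, drop it on a no.
        candidates = candidates & yes_set if answer else candidates - yes_set
        if len(candidates) == 1:
            break
    return candidates

candidates = {"apple", "banana", "carrot", "date"}
answered = [
    ({"apple", "banana", "date"}, True),  # "Is it a fruit?"  -> yes
    ({"banana"}, True),                   # "Is it yellow?"   -> yes
]
print(drill_down(candidates, answered))   # {'banana'}
```

If each question splits the remaining candidates roughly in half, even a huge answer space collapses after a handful of questions, which is what makes the yes/no framing so appealing.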

About four years ago, I was getting the impression at CES that Sony was falling behind the other big electronics hardware companies. LG and Samsung were making big leaps forward with 4K TVs and other appliances, but then Sony suddenly made a massive leap, and they seem to be advancing faster every year. You can find my writing about the X1 chip in my CES summary from 2020. This year they announced their XR chip for their Bravia televisions, with even more capability than before. Rather than write about it, I added the link instead.

Sony is about to release their Airpeak drone, which uses a combination of AI and robotics. It's designed specifically for their Alpha or "A-series" cameras. The new A7SIII camera has impressed the world of videographers, and building a purpose-built drone for it makes a lot of sense. I'm sure the content will be extraordinary. It's not a small drone either, and I'm guessing it won't be cheap. Crashing that rig with an A7SIII and a nice lens will set you back about $6k or more, not including the price of the drone. I imagine it will only be flown by the very best pilots, and under recoverable conditions.

Sony put up a lot of content around CES and you can find most of it on the www.sony.com site. They did impress me this week.

Abbott Labs is also producing another rapid COVID test. Coming to market with something like that is not a risk I'd like to take, given our advancement in vaccines.

Caterpillar had a video of their autonomous mining operation in full swing, with massive dump trucks going to and from the mining site all on their own. This falls in line with my belief about the direction of autonomous robots, from lowest risk to highest: Roomba, followed by in-building delivery robots, followed by closed-campus small robots, followed by closed-campus large environments, where Caterpillar is today. This stuff is past proof of concept and already in use right now. They are also using scaled-down swarming technologies to run multiple machines all doing the same task at the same time.

John Deere had an interesting discussion about the state of farming and the precision of their equipment. Their GPS correction technology gives seed-planting accuracy within two centimeters and can quickly adapt to soil conditions. Apparently, a 500-acre plot of land can have three or four different soil conditions on that same patch. The host pointed out that they can plant 500 acres in a day, up from 30 acres not that long ago. Tractors now plant and harvest at a speed of 10 MPH.
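Out of curiosity, I ran the numbers on that 500-acres-a-day claim. This is my own back-of-the-envelope math; the planter width is an assumption on my part, not a figure from the session:

```python
# Back-of-the-envelope check: can a tractor plant 500 acres in a day?
# 10 MPH is from the talk; the 60 ft planter width is my assumption.
SPEED_MPH = 10
WIDTH_FT = 60
FT_PER_MILE = 5_280
SQFT_PER_ACRE = 43_560

acres_per_hour = SPEED_MPH * FT_PER_MILE * WIDTH_FT / SQFT_PER_ACRE
print(f"{acres_per_hour:.0f} acres/hour")                 # ~73 acres/hour
print(f"{500 / acres_per_hour:.1f} hours for 500 acres")  # ~6.9 hours
```

Under those assumptions, 500 acres is about seven hours of field time, so the claim checks out.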

We didn't see anything from them that was fully autonomous yet, but what impressed me was the accuracy and the ability to so closely monitor crops to get optimum yield. A point of trivia: a weed can grow four inches in a day.

Now, this was cool…

(Screen shot, not a live link.) These guys were 6,000 miles apart.

I loved the interview with Amnon Shashua because of how they used technology. The two were 6,000 miles apart, and it was all shot in two different studios using green screens. They had almost identical chairs, but everything else was shot separately. Here are some photos of the process.

(Screen shots, not live links.) The image on the right with the interviewer is all prerecorded. The subject then just has to answer the questions as if it were a live interaction.

Part 3 coming up…
