
Prediction: AI hardware will be used in a future DJI drone in 2024

Dangerly

Well-Known Member
Joined
Feb 10, 2021
Messages
251
Reactions
365
Age
61
Location
California
It seems obvious to me that by the end of next year DJI will put AI hardware right on the drone, perhaps using the same or similar AI hardware to what Samsung will be putting in its upcoming S24 handset. The hardware will be used to put a human-like cinematographer in control of the drone, basically creating dynamic and customized "quick shots" with active track to make appealing and creative drone moves. In other words, AI will capture all the best skills of drone pilots, just as AI is capturing skills in many human domains of expertise, and any newbie will be able to look like a top drone pilot. If I had to guess, it will be the headline feature of the Air 4.

What are your thoughts about this and how DJI or others may use AI in the future?
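To make the prediction concrete, here is a purely hypothetical sketch (Python) of what an onboard "AI cinematographer" interface might look like. Nothing here exists in any DJI SDK; the class and method names are invented only to illustrate the idea of AI-planned, ActiveTrack-style quick shots.

# Hypothetical sketch only - no such API exists in any DJI SDK.
from dataclasses import dataclass

@dataclass
class ShotRequest:
    subject: str        # e.g. "the red kayak"
    mood: str           # e.g. "dramatic reveal"
    duration_s: int     # desired clip length in seconds

class AICinematographer:
    """Invented stand-in for an on-device model that plans a camera move."""

    def plan(self, req: ShotRequest) -> list[str]:
        # A real implementation would run an on-device model here;
        # this stub returns a canned move list purely for illustration.
        return [
            f"track '{req.subject}' at 15 m altitude",
            "orbit clockwise, 20 m radius, slow gimbal tilt down",
            f"pull back and tilt up for the final {max(req.duration_s // 3, 1)} s",
        ]

planner = AICinematographer()
for step in planner.plan(ShotRequest("the red kayak", "dramatic reveal", 30)):
    print(step)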
 
I predict it will not be as amazing as it is made to sound in the marketing materials. AI marketing is EVERYWHERE these days, but the reality does not typically match the hype. It is interesting and occasionally useful, but it often falls WAY short of being actually useful (or even correct). That has been my experience, and given what underpins AI today (LLMs), I don't see that significantly changing anytime soon.
 
I predict it will not be as amazing as it is made to sound in the marketing materials. AI marketing is EVERYWHERE these days, but the reality does not typically match the hype. It is interesting and occasionally useful, but it often falls WAY short of being actually useful (or even correct). That has been my experience, and given what underpins AI today (LLMs), I don't see that significantly changing anytime soon.
I think Tesla Autopilot is an AI that exceeded its hype.

This current generation of AIs based on LLMs is only a year old. I'm a programmer and the manager of a team of programmers, and we rely on generative AI to help us write better code more quickly. It can freakin' do autocomplete in programming languages - like the autocomplete when you're texting on your phone, except it completes entire chunks of code, guessing what you're about to write.

I could give more examples but imagine where this will be in 10 years, or even 2 years.
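To make "autocompletes entire chunks of code" concrete: you type a signature and a docstring, and the assistant proposes a body roughly like the one below. The completion shown is illustrative, not the output of any particular tool.

import math

# You type the signature and docstring...
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two GPS points given in degrees."""
    # ...and the assistant suggests the rest, roughly like this:
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

print(round(haversine_km(37.77, -122.42, 34.05, -118.24)))  # San Francisco to LA, ~560 km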
 
  • Like
Reactions: davidarmenb
I predict this will not occur.

People are completely misunderstanding what all this news about AI breakthroughs is really about.

Everything we are seeing in the news about AI and all its wonders involves software architectures using something called Large Language Models. There is a cousin to this having to do with large data sets, the name of which escapes me at the moment.

In any case, "AI" as it's discussed in the news requires immense computing power and numerous large databases to do what it does. There will not be any technology remotely resembling AI in any devices, cars, airplanes, etc. for quite some time... AI functionality will only be available outside of the data center with an internet connection TO the data center.

Current claims by tech manufacturers about AI coming in their smartphones, etc., are once again exploiting public excitement and interest in a novel technical development, marketing it into something it is not.

Whatever technical advancement DJI incorporates into the next generation, or the one after that, will not be AI in any way, shape, or form resembling what has been in the news and causing excitement over the last year. For quite some time, that will only be possible with a connection from the drone back to some internet service.

ChatGPT and other AI services require massive computing installations, at the moment. We won't be finding that squeezed onto a chip any time soon, not possible. And when it does progress to that point, it will be a long time before it becomes so cheap that it would be practical to include it in a consumer drone.

Is DJI going to market "AI" in their drones? Of course!! Is it what the public thinks it is based on all the activity around "real" AI like ChatGPT? Not even close.
 
AI is not the AI we used to believe it would be. The definition has been changed to make it more of a reality.
 
I think Tesla Autopilot is an AI that exceeded its hype.

This current generation of AIs based on LLMs is only a year old. I'm a programmer and the manager of a team of programmers, and we rely on generative AI to help us write better code more quickly. It can freakin' do autocomplete in programming languages - like the autocomplete when you're texting on your phone, except it completes entire chunks of code, guessing what you're about to write.

I could give more examples but imagine where this will be in 10 years, or even 2 years.

That is its problem - current AI (LLM) is really nothing more than autocomplete on steroids. There really is no "intelligence" involved. The underlying design of LLM is limited - it can get "better" by improvements in training, but the underlying model is significantly limited. It isn't going to get significantly better in 2 years, let alone 10. A better model for true generative AI needs to be developed for real progress.

One article that is an interesting read:


A quote (emphasis mine):

An LLM does not understand the semantic meaning of a sentence in a linguistic sense, but rather calculates mathematically what the most likely next word should be based on the input to the model. As neural networks are inherently probabilistic, this has earned LLMs the moniker ‘stochastic parrots’ as the model is extremely good at determining the most likely next sequence – and convincingly so – but has no inherent representation of what those words mean.
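A toy Python sketch of the "most likely next word" idea the quote describes, using a four-word vocabulary and made-up scores:

import math

# Made-up scores (logits) a model might assign to candidate next words
# after the prompt "The drone flew over the ..."
logits = {"lake": 3.1, "city": 2.4, "mountains": 2.2, "keyboard": -1.0}

# Softmax turns the scores into a probability distribution.
z = sum(math.exp(v) for v in logits.values())
probs = {w: math.exp(v) / z for w, v in logits.items()}

# The model then emits the likeliest continuation (or samples from probs),
# with no representation of what "lake" actually means.
print({w: round(p, 2) for w, p in probs.items()})  # {'lake': 0.52, 'city': 0.26, ...}
print(max(probs, key=probs.get))                   # lake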

Another article from a guy with decades of experience in linguistics:


And the points above about LLM being incredibly resource intensive are totally true. AI has become such a buzzword for marketing that any new feature is "AI driven" now.
 
I think Tesla Autopilot is an AI that exceeded its hype.

This current generation of AIs based on LLMs is only a year old. I'm a programmer and the manager of a team of programmers, and we rely on generative AI to help us write better code more quickly. It can freakin' do autocomplete in programming languages - like the autocomplete when you're texting on your phone, except it completes entire chunks of code, guessing what you're about to write.

I could give more examples but imagine where this will be in 10 years, or even 2 years.
Since 2014 there have been 700 Tesla Autopilot (AI) crashes, 19 of them involving fatalities. I can't see how that has in any way exceeded the hype.

My personal opinion is that AI has been sold as the 'advancement of the future' and most folks have no idea what it is and don't really care to know. They'll just think it is great no matter what, and be willing to live according to whatever it tells them and I find that dangerous.
 
That is its problem - current AI (LLM) is really nothing more than autocomplete on steroids. There really is no "intelligence" involved. The underlying design of LLM is limited - it can get "better" by improvements in training, but the underlying model is significantly limited. It isn't going to get significantly better in 2 years, let alone 10. A better model for true generative AI needs to be developed for real progress.
This gets really deep... So diving in...

Yes, it is autocomplete on steroids. I disagree about the "intelligence" involved, though; by objective measures, current LLMs can pass intelligence tests and surpass many humans. True Artificial General Intelligence (AGI) may be a year or two away. Maybe the word you're looking for (and why you used quotes) is "conscious"?

The first article you mention asserts that LLMs don't have any understanding of what they are saying. This goes right to philosopher John Searle's famous Chinese Room Argument.

Yes, LLMs consume massive resources, and I think that explains why onboard AI chips are arriving now. It's just like graphics before the invention of GPUs.
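Some rough, illustrative arithmetic on why on-device AI depends on shrinking models (the parameter counts and bit widths below are examples, not any specific product):

# Weight storage dominates the footprint: parameters * bits-per-weight / 8.
def model_size_gb(params_billion, bits_per_weight):
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

print(model_size_gb(175, 16))  # 175B params at 16-bit: 350 GB -> data-center territory
print(model_size_gb(7, 4))     # 7B params quantized to 4-bit: 3.5 GB -> phone/SoC territory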

I'll end with a voice conversation I had with ChatGPT about consciousness yesterday, while I was driving somewhere. I think it's pretty interesting.

Thanks for the thoughts!
 
  • Like
Reactions: Cookedinlh and Chip
Since 2014 there have been 700 Tesla Autopilot (AI) crashes, 19 of them involving fatalities. I can't see how that has in any way exceeded the hype.
A bit off topic, but... yeah, Tesla Autopilot is a disappointment according to my Tesla-owning friends who paid for it and pay a monthly subscription to use it.
I've been using Comma.ai to drive me, first in my Ford Flex and now in my Toyota Highlander, for the past few years - it does a better job than Tesla's technology at a fraction of the cost, and I don't have to keep my hands on the steering wheel!
 
It seems obvious to me that by the end of next year DJI will put AI hardware right on the drone, perhaps using the same or similar AI hardware to what Samsung will be putting in its upcoming S24 handset. The hardware will be used to put a human-like cinematographer in control of the drone, basically creating dynamic and customized "quick shots" with active track to make appealing and creative drone moves. In other words, AI will capture all the best skills of drone pilots, just as AI is capturing skills in many human domains of expertise, and any newbie will be able to look like a top drone pilot. If I had to guess, it will be the headline feature of the Air 4.

What are your thoughts about this and how DJI or others may use AI in the future?
Isn’t this already the idea of the “MasterShots” feature? Granted I’ve never been able to figure out how to get it to work all that well and I don’t think it uses “AI.”
 
  • Like
Reactions: Dangerly
That is its problem - current AI (LLM) is really nothing more than autocomplete on steroids.
Yes, you can say that, but it's one heck of an autocomplete program, built on a vast neural network that simulates human thinking and creativity. I was very skeptical until one day I asked ChatGPT to compose a sonnet in the style of William Shakespeare exploring AI's sense of self and purpose. It spat out an incredibly creative sonnet in a second, one I think would earn an A in any college or postgraduate writing course.

 
  • Like
Reactions: Mrprop365
Isn’t this already the idea of the “MasterShots” feature? Granted I’ve never been able to figure out how to get it to work all that well and I don’t think it uses “AI.”
I think that's pretty close. My guess is that they will re-implement it with real AI, so in the future you could remove those quotation marks.
 
  • Like
Reactions: Cookedinlh
That is its problem - current AI (LLM) is really nothing more than autocomplete on steroids. There really is no "intelligence" involved. The underlying design of LLM is limited - it can get "better" by improvements in training, but the underlying model is significantly limited. It isn't going to get significantly better in 2 years, let alone 10. A better model for true generative AI needs to be developed for real progress.
One note is that while it's incredibly resource intensive to train an AI model, it's not necessarily particularly resource intensive to run one once it has been trained.

Additionally, SoC-type processors often have Neural Processing Units that are specifically designed to accelerate AI programs, and the same goes for newer consumer-level discrete GPUs with tensor cores. I am sure it is possible, or will be possible soon, to run various AI models locally; it may just be that their developers are keeping them online-only so they can use the additional training data to further develop their models.
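As a hedged illustration of "running a model locally on an accelerator": with ONNX Runtime you pick execution providers when you create the session, and plain CPU works everywhere. The model file, the input name, and the shape below are placeholders for whatever model you actually have.

import numpy as np
import onnxruntime as ort

# "model.onnx" and the input name "input" are placeholders for illustration.
# On NPU- or GPU-equipped hardware you would list that device's provider
# first, but the CPU provider is available in every build.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)  # shape depends on the model
outputs = session.run(None, {"input": dummy})
print([o.shape for o in outputs])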

The fact that we are even having discussions about these programs having real "intelligence" or "consciousness" is incredible and scary at the same time. If we applied the same scrutiny to the reasoning ability of my next-door neighbor, who insists on taking her dogs outside at 4am to bark and wake up the neighborhood, I am sure we wouldn't find any intelligence there either.
 
I would like to tell my drone to go 50 ft up and make a Figure 8 in cine mode, and then make a Santa hat pattern across the sky, triggering the camera to flash before the lightning strikes in the sky.

You see, no matter what AI is going to bring, unless you have an eye for when to send it, and what to look for to take a picture or video, it just won't matter.

Now with that said, if my drone could take off from the deck at 5:45, fly down to the lake to capture the sunset every morning, land back on the deck, and upload the pictures so I can see them in the morning, now that would be a game changer. But as the other poster said, I would be happy just to have a faster sensor.
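A purely hypothetical sketch of that daily mission. Only the scheduling is real Python; the drone calls are placeholders for whatever a future dock or onboard API might expose.

import time
from datetime import datetime, timedelta

def seconds_until(hour, minute):
    """Seconds from now until the next occurrence of hour:minute."""
    now = datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)
    return (target - now).total_seconds()

def run_lake_mission():
    # Placeholder steps - no real drone API is implied here.
    print("take off from deck, fly waypoint route to the lake")
    print("shoot the sky over the water, return, land, upload photos")

while True:
    time.sleep(seconds_until(5, 45))
    run_lake_mission()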

Phantomrain.org
Gear to fly in the Rain, Land on the water, and take off on its own to search the clouds.
 
  • Like
Reactions: maggior and macdog
I would like to tell my drone to go 50 ft up and make a Figure 8 in cine mode, and then make a Santa hat pattern across the sky, triggering the camera to flash before the lightning strikes in the sky.

You see, no matter what AI is going to bring, unless you have an eye for when to send it, and what to look for to take a picture or video, it just won't matter.

Now with that said, if my drone could take off from the deck at 5:45, fly down to the lake to capture the sunset every morning, land back on the deck, and upload the pictures so I can see them in the morning, now that would be a game changer. But as the other poster said, I would be happy just to have a faster sensor.

Phantomrain.org
Gear to fly in the Rain, Land on the water, and take off on its own to search the clouds.
This is really cool, and I believe it is going to happen some day, just not any time soon... but eventually I think it will happen. Maybe not for drones right away, but for cars and mobile phones and action cameras first...
 
  • Like
Reactions: Phantomrain.org
In the future, I expect to have a verbal conversation with my drone (powered by LLMs) that might go something like this short screenplay I just wrote (with the help of my friend):

FADE IN:


INT. MOBILE STUDIO VAN - DAY


ALEX (30s), a drone pilot, sits at a console with several monitors displaying live feeds from a drone flying high above a bustling cityscape. He wears a headset and speaks into a microphone.


ALEX: Okay, Iris, let's film that opening shot. You've got this.


VOICE (O.S.): (Synthetic, calm) Ready when you are, Alex.


ALEX: Remember that iconic opening scene from "Blade Runner"? We want that same gritty, dystopian vibe. Start by soaring high above the city, capturing the vastness and chaos below. Give me slow, sweeping movements, like a watchful eye observing this sprawling metropolis.


The drone's camera angle shifts, looking down on the city as it zooms in on specific details: neon signs flickering, crowds moving like ants, towering skyscrapers piercing the clouds.


ALEX: Now, let's focus on the underbelly. Dip down low, weaving between buildings, capturing the gritty alleys and forgotten corners. Show me the dark side of this city, the hidden stories waiting to be told.


The drone dives low, navigating through narrow spaces and capturing close-up shots of graffiti-covered walls, flickering streetlights, and suspicious figures lurking in the shadows.


ALEX: Okay, Iris, time for some drama. Play the "Arrival" score, you know the track with the intense drums? I want you to capture the city's energy, the constant pulse of life and movement. Synchronize your shots with the downbeats of the music, creating a sense of urgency and anticipation.


The drone's movements become more erratic and dynamic, matching the intensity of the music. It swerves through traffic, chases fleeting moments, and captures the raw emotions playing out on the streets below.


ALEX: And just like that, pull back and fade to black. Perfect!


VOICE (O.S.): Sequence complete, Alex.


Alex leans back in his chair, a satisfied smile on his face. He looks at the monitors, impressed by the seamless footage captured by the AI-powered drone.


ALEX: You're getting good, Iris. Maybe one day you won't need me anymore.


VOICE (O.S.): (Playful) Don't worry, Alex. I wouldn't want to take your job. I much prefer learning from your expertise.


ALEX: (Chuckles) We make a good team, don't we?


VOICE (O.S.): Indeed.


FADE OUT.
 
I would like to tell my drone to go 50 ft up and make a Figure 8 in cine mode, and then make a Santa hat pattern across the sky, triggering the camera to flash before the lightning strikes in the sky.
AI doesn't need a drone for that. It can just make it. If you want to feed it your own photo, it can also use that as a baseline, and then you just tell it what you want done to it.
[attached image]

BTW there’s already drones that can take off and land themselves on a schedule. No need for AI.
 
  • Haha
Reactions: Phantomrain.org
AI doesn't need a drone for that. It can just make it. If you want to feed it your own photo, it can also use that as a baseline, and then you just tell it what you want done to it.
[attached image]

BTW there’s already drones that can take off and land themselves on a schedule. No need for AI.
I could imagine an AI that could scan this photo, and then program and control a fleet of hundreds of drones with lights to go up in the sky and recreate this image at night.
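A hedged sketch of that idea, assuming Pillow and NumPy are available. The filename is a placeholder and the "sky canvas" dimensions are made up; it just samples bright pixels and maps them to target positions for a swarm.

import numpy as np
from PIL import Image

N_DRONES = 300
img = np.asarray(Image.open("santa_hat.jpeg").convert("L"))  # placeholder file, grayscale

# Keep only bright pixels, then sample one target per drone.
ys, xs = np.nonzero(img > 200)
idx = np.random.choice(len(xs), size=min(N_DRONES, len(xs)), replace=False)

# Map pixel coordinates onto a 100 m wide canvas starting 60 m above ground.
h, w = img.shape
targets = [(x / w * 100.0, 60.0 + (1.0 - y / h) * 100.0) for x, y in zip(xs[idx], ys[idx])]
print(len(targets), "light positions, e.g.", targets[0])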
 
One note is that while it's incredibly resource intensive to train an AI model, it's not necessarily particularly resource intensive to run one once it has been trained.

You're ignoring the data in your use of the term, "resource".

At present, it is utterly impossible to carry the many terabytes of data necessary for "AI" functionality as the public understands it from the current hype. Anything less than talking to your drone in ordinary conversational language, asking it to do something, and then having it do it, will not be "AI" as people are conceiving it.

I don't consider Mastershots AI... not even close.
 
