
Things you'd do with a self driving car

Created by FlySurfer, 30 Nov 2017
FormulaNova
WA, 14734 posts
2 Dec 2017 5:54PM

nicephotog said..


FormulaNova said..






nicephotog said..







FormulaNova said..
Seriously, everyone must be thinking it would be awesome to be pissed and have the self driving car drive you home.

Also on my favorites would be "driving" to Perth from Sydney. Not much of a drama really when you can watch TV, eat, sleep, read a book, as the miles pass by. I guess you would have to wake up for the fuel stops, but I think I could live with that. You could do it in record time.

Having said all this, I think the most impressive journey would be the work commute. No problems worrying about traffic when you are in the back having a nap or watching something on TV.

* ..."awesome to be pissed and have the self driving car drive you home"...
* ..."watch TV, eat, sleep, read a book, as the miles pass by"...
No actually, they did say that these vehicles do require "full awareness" while in use because some input is required for the vehicle to understand it is doing the right thing, and that is not true of being pissed , and not true in some split second emergency in some situations !
economictimes.indiatimes.com/small-biz/security-tech/technology/driverless-crashes/articleshow/56510821.cms

As a computer programmer myself i can tell you what to look for alike looking for the problem in a black box of an aircraft for information!
Sensors for example only read back what they get, and when constructed the a sensor unit and its specific job is calibrated to quality boundary of reception band.
However, sensors can go wrong as too components can change their values under particular physical environment conditions.
To make matters worse, the information is sent through what is known as a "core" of a CPU which is responsible for sorting, storing and matching data and committing operation upon that data in precedence simultaneously by which in that "scheme" one sensor is one thread (roughly) , and as management of threads in prioritization is called, to stop a thread is called an "interrupt". Prioritisation and backlogging can be a serious problem.

What they advertise in things such as driver-less buses in operation at this time is they have extremely defined routes. However one of those driver-less buses had quite a slammer "of its own failure" in the past year or two (and others that were not the driver-less vehicles fault e.g. recently in Las Vegas) !

My suggestion is that someone is sipping coffee at their same destination from same start place half way through your back breaker between Perth and Sydney !
windsolarhybridaustralia.x10.mx/PPL-light-aircraft.pdf

more... (not enough information by statistics as there are very few driver-less vehicles to commit that)
www.nytimes.com/interactive/2016/07/01/business/inside-tesla-accident.html

The greatest subtlety about driver-less vehicles and ordinary road accidents by human error is the immense complexity something is trying to "A.K.A." "understand and simulate in real world actions" is really too human a problem for the obscure unique events of environment change. Hazard a guess it would require some extremely human ideals of mind to "understand means to end to traverse some obstacles carefully and sensibly enough" in face of non standardized route occurring or freak weather event

(Note: believe it or not, I actually reckon a vehicle should weigh about "2 ton" (1814.37 kg) to avoid being blown off some otherwise good roads in good weather on a perfectly good surface!)

============== I do not give them the point; it would require a "super cradle computer" to properly assess the information at the speed required!
en.wikipedia.org/wiki/Supercomputer

NOT A TRUE LOOK, BUT SOMETHING "VAGUELY COMPARATIVE"

one byte is (generally) eight bits

1 megabyte = 1,048,576 bytes

8 x 1,048,576 = 8,388,608 "bits" (ones or zeros) IN A MEGABYTE

120 km/h = 33,333.3 millimetres per second

8 GHz = 8 thousand million cycles per second = 8,000,000,000

so 1/1000th of a second = 8,000,000 cycles

each "bit" (either one or zero) through the CPU instruction core is one cycle unit

THE REAL FIGURES WOULD BE CLASSIFIED INFORMATION, SO THESE VALUES ARE SUPPOSITION
A "colour image" of 12 megapixels can be said to be 12,000,000 x 3 = 36,000,000 bytes (red, green, blue)
A radar image can be said to be grey-scale: 12,000,000 bytes
FOR IMAGING: 48 MILLION BYTES OF INFORMATION

Simply to look at the picture requires buffering, so putting it through the core means x 2 for "TIME"

2 x 48 million = 96,000,000 bytes

then interrupts must stop the thread to examine each eight-bit set, so an interrupt comes with every eighth bit, making x 3 for "TIME"

but then the instruction index has to set the CPU core circuitry to operate on the byte data, using the stack and the program's instructions, which sit in a separate software byte-data location

so we logically reach x 6 (much more really) for the time to read and operate on the data!!!


48 million x 6 = 288,000,000 bytes of DATA; taking that handling factor again across the whole job, call it 1,728,000,000 bytes; x 8 = 13,824,000,000 bits (one cycle each)


at 8,000,000 cycles of CPU core beat per thousandth of a second,

13,824,000,000 divided by 8,000,000 = 1,728 thousandths of a second

1,728 / 1000 = 1.728 SECONDS


33,333 (mm per second) x 1.728 = about 57,600 mm

================

SOMETHING LIKE .................. (approximately 58 metres travelled with an 8 GHz single-core CPU)

That's actually "a hell of a long way travelled" in popping the question on the console if something is wrong!!!

57.6 metres travelled (@ 120 km/h) to complete one basic round of SENSOR RECEPTION AND SOFTWARE PROCESSING!

================
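As a quick sanity check of the back-of-envelope figures above, here is the same chain of suppositions worked through in a short Python snippet. The x6 handling factor, the 48 MB per "look" and the one-bit-per-cycle simplification are all from the post, not real autopilot figures:

```python
# Recomputes the "vaguely comparative" estimate above.
# All inputs are suppositions from the post, not measured figures for any real vehicle.
camera_bytes = 12_000_000 * 3               # 12 MP colour image, 1 byte per channel
radar_bytes  = 12_000_000                   # 12 MP grey-scale radar image
frame_bytes  = camera_bytes + radar_bytes   # 48,000,000 bytes per "look"

handling_factor = 6 * 6                     # x6 for buffering/interrupts/instructions,
                                            # taken again across the whole job (as in the post)
bits = frame_bytes * handling_factor * 8    # one bit assumed to cost one CPU cycle

clock_hz = 8e9                              # 8 GHz single core
seconds = bits / clock_hz                   # ~1.728 s to chew through one frame

speed_mm_per_s = 120 * 1_000_000 / 3600     # 120 km/h = 33,333 mm/s
print(seconds, speed_mm_per_s * seconds / 1000)   # ~1.728 s, ~57.6 m travelled
```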


I once worked out that while my hard disk spends a few hours at 5,200 RPM defragmenting, the rim of the roughly 5-inch-diameter platter travels around 700 km to complete the task.
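That figure is roughly checkable; a quick calculation (assuming the 5-inch platter from the post and about five and a half hours of defragmenting as the "few hours") lands in the same ballpark:

```python
import math

rpm = 5200
diameter_m = 5 * 0.0254                              # 5-inch platter, in metres
rim_speed_m_per_min = rpm * math.pi * diameter_m     # ~2,075 m per minute at the rim
hours = 5.5                                          # assumed "few hours" of defragging
print(rim_speed_m_per_min * 60 * hours / 1000)       # ~685 km
```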

A GIANT supposition, but there is some of the problem.

The strategy computer scientists use to gain more CPUs and make a virtual "cradle" supercomputer is to network a set of computers together.

I don't mean to be insulting, but are you sure you are a computer programmer? You go into a lot of detail about trivial things, but the way you evaluate information does not have to happen the way you describe.

Why would you need to have interrupts when you are processing that amount of data at speed? Seriously, I think you are not at the level required for understanding this problem.

I am sure there are a lot of advances in computer vision, and it's not as simple or as complex as 'sending each bit through the core'.

I thought this was meant to be a fun thread anyway.

If I am going to have to sit there and monitor the thing, I may as well let my wife do it, and wake me when we are there. She can sleep while I sail and be refreshed for the drive back. If I drive to Perth from Sydney, how many wives do I need?

@ FormulaNova
Here is a quote from my post: ..."NOT A TRUE LOOK, BUT SOMETHING "VAGUELY COMPARATIVE""...
If you had bothered to read that heading above the maths, you would not have needed to bother with much of your post and its question.

@ FormulaNova Second, obviously you have no idea how a computer operates, nor how imaging works!
I am a computer programmer! I have written two freeware desktop programs, one of them an imaging program!

Here is a link to the imaging program's code (an older version than the current one); to see the code properly you need to download the file and view it with zoom in Adobe Acrobat Reader.
www.scribd.com/doc/270239630/Sidewinder-Photo-Colour-Balancer


(Quote FormulaNova) Why would you need to have interrupts when you are processing that amount of data at speed?

(To operate on data, e.g. to measure it, change it or correlate it, the data must go through the instruction circuits in the core; that is what it is to be a CPU, that is what happens. You cannot have a program or data without that; anything else as an "electronic computer" is impossible!) A single core runs only one thread (a sub-program instruction set indexed on the stack) at a time. To work on data from other sensors, the running thread is stopped (interrupted) after a particular slice of time, another set of instructions is loaded for another thread (another sub-program of the OS) to work on, and so on until all of it is completed.
That action is multi-threading. Because that is too slow, new CPUs now carry multiple cores so they do not need to stop one sub-program for another to run simultaneously.
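To picture the time-slicing being described, here is a toy round-robin scheduler in Python. The sensor names and work units are invented for illustration; no real autopilot schedules its work this way:

```python
# Toy illustration: round-robin time-slicing of several "sensor" work queues
# on a single core, the scheme described above.
from collections import deque

def round_robin(jobs, quantum):
    """jobs: dict of name -> remaining work units; quantum: units per time slice."""
    queue = deque(jobs.items())
    elapsed = 0
    finish_times = {}
    while queue:
        name, remaining = queue.popleft()
        step = min(quantum, remaining)
        elapsed += step                      # time spent in this slice
        remaining -= step
        if remaining:
            queue.append((name, remaining))  # "interrupted", back of the queue
        else:
            finish_times[name] = elapsed     # this sensor's data is finally processed
    return finish_times

# Three sensors needing different amounts of processing (arbitrary units).
print(round_robin({"camera": 9, "radar": 4, "sonar": 2}, quantum=3))
# The camera data is not finished until all the interleaving is done,
# which is the prioritisation/backlog problem described above.
```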

As for the maths, my imaging program working on a 24-megapixel image runs at around the speed and data throughput of that example.
windsolarhybridaustralia.x10.mx/LinuxSideWinderPhotoColourBalancerConfigInstal.html
With the driver-less car, it is simply a look at how long an ordinary high-speed CPU takes to process a sensor's information, and how much trouble the vehicle would be in during an emergency at speed (how far it may have travelled before it figures out what its obstacle is or was, given a non-descript reading on the sensor). In computing, that identification step is called "matching", as with regular expressions, conditional expressions and booleans.



Whichever way you look at it, for quite some time to come it would be a good idea to "be awake and monitor" the vehicle and environment, as the Tesla incident in Florida shows.



Okay, no problem. It's just that the language you used seemed a bit unusual.

I think the way you view the problem is probably the way a software engineer would: a very serial, logical way of doing it. Using general-purpose cores, this is how you would approach it, but why not use hardware that can do it much more easily?

From a hardware designer's perspective, you would see it a little differently, and make the hardware do a lot of the hard stuff.

You can have frame buffers, so there is no need to even 'accept' the data. It will just be there. You can work on it as needed, and embed some of the smarts in the hardware. Why write code to find patterns when you can create hardware to do it, and do it much faster and in parallel?

Modern CPUs are generic. You can achieve a lot more in the same silicon by doing things in hardware constructs. Want to multiply every number in an array by 27.5? Doing it in software can take a lot of resources. Do it in hardware, and it happens much more quickly.
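To make that point concrete in software terms, here is a rough Python/NumPy sketch. NumPy's vectorised multiply stands in for the dedicated parallel hardware being described, and the array size is arbitrary:

```python
# Illustrative only: a software analogue of the hardware argument above.
# NumPy pushes the element-wise multiply down into optimised native code,
# much as dedicated hardware would, instead of touching each element in turn.
import time
import numpy as np

data = np.arange(2_000_000, dtype=np.float64)

t0 = time.perf_counter()
slow = [x * 27.5 for x in data]   # one-at-a-time, interpreted loop
t1 = time.perf_counter()
fast = data * 27.5                # single vectorised operation
t2 = time.perf_counter()

print(f"loop: {t1 - t0:.3f}s  vectorised: {t2 - t1:.4f}s")
```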

If you have enough 'cores' you can do lots, but if you have a requirement to move huge amounts of data from one location to another and operate on it, it's going to be much more efficient to create a separate hardware function, or even a dedicated 'processor', to do just that.

I am sure machine vision uses a lot of parallel processing and custom logic. It just makes sense to do it that way. As an example, you don't need to see an exact image of what the car is seeing elsewhere in your program, you just need to identify things like lane markers, other cars, and people, and then pass that data upwards.

I remember back to the eighties, when computer graphics was still relatively primitive. Everything was done in software. Then someone figured out that you can move blocks of data around and manipulate them as you do so, and do it in hardware. The result was much better graphics than thought possible, with relatively slow processors. DMA has been around for ages, but to modify the data as you do so can save a lot of processing power.

I think you will see much more advancement in self-driving cars than you would expect.

nicephotog
NSW, 251 posts
2 Dec 2017 9:22PM

(Quote FormulaNova)..."Modern CPUs are generic. You can achieve a lot more in the same silicon by doing things in hardware constructs. Want to multiply every number in an array by 27.5? Doing it in software can take a lot of resources. Do it in hardware, and it happens much more quickly."...

Lower-level CPUs are often dedicated to exactly such actions; they are mainly referred to as "slave processors" because they fulfil very few functions with very few instruction circuits in the core (you only feed them their type of meat), e.g. calculators. More usual are software I/O calculators that other devices network into, or that receive "signals" under their own data-packet I/O protocol. But to skip to the point: the tactic or methodology you are pointing to is as old as computers that ran tape and communicated across phone and radio lines, such as company or institutional VAX machine networks.

If you understand how radar finds the position of an aircraft, by the clocked time between send and receive, the speed of light, and the returned frequency heard, you will understand that being right in recognising the "picture" and identifying its contents is a vast job in the real world, particularly if there are no lane markers. Final identification of an obstacle would be as much a maze (puzzle) of information as what I am writing here.
e.g. If it were simply a grey hoodie, is it a person or an animal if there is no EVERLAST, LONSDALE, ADIDAS or NIKE written on it??????
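For anyone unfamiliar with the radar principle being referred to, here is a minimal Python sketch of it: range from the round-trip time of the echo, and closing speed from the Doppler shift. The pulse timing and the 77 GHz carrier below are illustrative values only, not figures from any particular vehicle:

```python
# Minimal sketch of the send/receive radar principle mentioned above.
C = 299_792_458.0  # speed of light, m/s

def radar_range(round_trip_s: float) -> float:
    """Distance to the target: the pulse travels out and back."""
    return C * round_trip_s / 2

def radial_speed(f_tx_hz: float, f_rx_hz: float) -> float:
    """Approximate closing speed from the Doppler shift (f_rx - f_tx)."""
    return C * (f_rx_hz - f_tx_hz) / (2 * f_tx_hz)

print(radar_range(200e-6))                 # echo after 200 microseconds -> ~30 km away
print(radial_speed(77e9, 77.000_010e9))    # 10 kHz shift at 77 GHz -> ~20 m/s closing
```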

It is all very well to recognise conditions when they are what they should be, but what happens when they are not is the real challenge. And as you know if you hold an Australian driver's licence, you are not allowed to pull up for pets, so as not to endanger humans and others as a result!
The place where the Tesla fatality occurred has an airport, and these driver-assistance or driver-less car computers also use other systems such as radar and sonar.
Aircraft propellers do not simply emit a low droning sound; they vibrate, and like transmitters (oscillators that are not operating properly) they produce harmonic frequencies (apparently there are radio-wave effects classed as similar to Doppler too) that could have interfered. The crash details finally given by Tesla were that the vehicle's computer had recognised the truck's trailer as an overhead road sign, which looks like some sort of calibration deficiency of range and size rather than a camera-view deficiency (or maybe a flawed assumption in the software).

(The maths was just an idea of how much time a high-end chip would give the software to solve the problem.)

Like Johnny in Flying High (Aeroplane) said ... What do you make of this...


A good guess is that driver-less cars would take a huge quantity of extra equipment and rules, set up for such vehicles alongside the normal present traffic system!
I believe it is something that can never be achieved SAFELY by the individual vehicle's equipment alone!
It is simply a very underhandedly hammed-up tout to everyone (I am not saying they cannot do quite a lot with slightly lesser systems; just do not claim it does not suffer the same failings as a driver, e.g. on long road trips hour after hour).

E.G. Automatic policing ( police-less policing ? )



As you can see these types of technology have been attempted before with much the same result as the Tesla fatality ....

hilly
WA, 7323 posts
2 Dec 2017 7:47PM

nicephotog said..

As you can see these types of technology have been attempted before with much the same result as the Tesla fatality ....



You do know they are on the road already??

nicephotog
NSW, 251 posts
2 Dec 2017 10:59PM

hilly said..

You do know they are on the road already??



Select to expand quote
hilly said..

nicephotog said..


FormulaNova said..






nicephotog said..








FormulaNova said..












nicephotog said..













FormulaNova said..
Seriously, everyone must be thinking it would be awesome to be pissed and have the self driving car drive you home.

Also on my favorites would be "driving" to Perth from Sydney. Not much of a drama really when you can watch TV, eat, sleep, read a book, as the miles pass by. I guess you would have to wake up for the fuel stops, but I think I could live with that. You could do it in record time.

Having said all this, I think the most impressive journey would be the work commute. No problems worrying about traffic when you are in the back having a nap or watching something on TV.





















* ..."awesome to be pissed and have the self driving car drive you home"...
* ..."watch TV, eat, sleep, read a book, as the miles pass by"...
No actually, they did say that these vehicles do require "full awareness" while in use because some input is required for the vehicle to understand it is doing the right thing, and that is not true of being pissed , and not true in some split second emergency in some situations !
economictimes.indiatimes.com/small-biz/security-tech/technology/driverless-crashes/articleshow/56510821.cms

As a computer programmer myself i can tell you what to look for alike looking for the problem in a black box of an aircraft for information!
Sensors for example only read back what they get, and when constructed the a sensor unit and its specific job is calibrated to quality boundary of reception band.
However, sensors can go wrong as too components can change their values under particular physical environment conditions.
To make matters worse, the information is sent through what is known as a "core" of a CPU which is responsible for sorting, storing and matching data and committing operation upon that data in precedence simultaneously by which in that "scheme" one sensor is one thread (roughly) , and as management of threads in prioritization is called, to stop a thread is called an "interrupt". Prioritisation and backlogging can be a serious problem.

What they advertise in things such as driver-less buses in operation at this time is they have extremely defined routes. However one of those driver-less buses had quite a slammer "of its own failure" in the past year or two (and others that were not the driver-less vehicles fault e.g. recently in Las Vegas) !

My suggestion is that someone is sipping coffee at their same destination from same start place half way through your back breaker between Perth and Sydney !
windsolarhybridaustralia.x10.mx/PPL-light-aircraft.pdf

more... (not enough information by statistics as there are very few driver-less vehicles to commit that)
www.nytimes.com/interactive/2016/07/01/business/inside-tesla-accident.html

The greatest subtlety about driver-less vehicles and ordinary road accidents by human error is the immense complexity something is trying to "A.K.A." "understand and simulate in real world actions" is really too human a problem for the obscure unique events of environment change. Hazard a guess it would require some extremely human ideals of mind to "understand means to end to traverse some obstacles carefully and sensibly enough" in face of non standardized route occurring or freak weather event

(note - Believe it or not i actually know a vehicle should weigh "2 ton" ( 1814 . 37Kg ) on some good roads to not be blown off the road in good weather with a perfect good surface !!! ).

============== I do not give them the point: it would require a "super cradle computer" to properly assess the information at the speed required!
en.wikipedia.org/wiki/Supercomputer

NOT A TRUE LOOK, BUT SOMETHING "VAGUELY COMPARATIVE"

one byte is (generally) eight bits

1 megabyte = 1,048,576 bytes

8 x 1,048,576 = 8,388,608 "bits" (one or zero) in a megabyte

120 km/h = 33,333 millimetres per second

8 GHz (8 thousand million cycles per second) = 8,000,000,000

1/1000th of a second = 8,000,000 cycles

suppose each "bit" (either one or zero) through the CPU instruction core costs one cycle

THE FOLLOWING WOULD BE CLASSIFIED INFORMATION, SO THESE VALUES ARE A "SUPPOSE"
A "colour image" of 12 megapixels can be said to be 12,000,000 x 3 = 36,000,000 bytes (red, green, blue)
A radar image can be said to be grey-scale at 12,000,000 bytes
FOR IMAGING: 48 MILLION BYTES OF INFORMATION per frame

Simply to look at the picture will require buffering, so putting it through the core means x 2 for "TIME"

2 x 48 million = 96,000,000 bytes

then interrupts must stop the thread to examine each eight-byte set, so an interrupt comes with the eighth byte, making x 3 for "TIME"

but then the instruction index has to set the CPU core circuit to commit an operation on the byte data, using the stack and the "program data" - CALLED INSTRUCTIONS - that sit in a separate software byte-data location

so we logically reach x 6 (much more really) for the time to read and operate on the data!!!


48 million x 6 = 288,000,000 bytes of DATA in total; x 8 = 2,304,000,000 bits (one cycle beat each)


1/1000th of a second = 8,000,000 cycles of CPU core beat

8,000,000 - eight million cycles per "thousandth of a second"

2,304,000,000 divided by 8,000,000

= 288 thousandths of a second for the data

288 / 1000 of a second = 0.288 SECONDS


33,333 (mm per second) x 0.288 = about 9,600 mm

================

SOMETHING LIKE .................. (approximately ten metres travelled with an 8 GHz core CPU)

That's actually "a hell of a long way travelled" before popping the question on the console that something is wrong!!!

About 9.6 metres travelled (@ 120 km/h) to complete one basic round of SENSOR RECEPTION AND SOFTWARE PROCESSING!

================


I once figured that for my hard disk to defrag itself over a few hours at 5,200 RPM, the rim edge of the 5-inch platter travels around 700 km to complete the task.

A GIANT "suppose", but there is some of the problem.

The strategy computer scientists use to gain more CPUs and make a virtual cradle supercomputer is to network a set of computers.
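
For what it's worth, the same back-of-the-envelope sum can be written as a few lines of Python. The frame sizes, the x 6 overhead factor and the imaginary 8 GHz one-bit-per-cycle core are only the guesses above, not figures from any real car:

# Back-of-the-envelope only: the frame sizes, the x6 software overhead and the
# hypothetical 8 GHz one-bit-per-cycle core are the assumptions from the post
# above, not measurements of any real vehicle.
camera_bytes = 12_000_000 * 3              # 12 MP colour frame, one byte per channel
radar_bytes = 12_000_000                   # grey-scale radar frame
frame_bytes = camera_bytes + radar_bytes   # 48 million bytes per "look"

overhead = 6                               # buffering + interrupts + instruction fetch (a guess)
bits_to_push = frame_bytes * overhead * 8  # one clock cycle per bit, per the assumption above

clock_hz = 8_000_000_000                   # hypothetical 8 GHz core
seconds_per_frame = bits_to_push / clock_hz

speed_mm_per_s = 120 * 1_000_000 / 3600    # 120 km/h expressed in millimetres per second
metres_travelled = seconds_per_frame * speed_mm_per_s / 1000

print(f"{seconds_per_frame:.3f} s per frame, about {metres_travelled:.1f} m travelled at 120 km/h")
# prints: 0.288 s per frame, about 9.6 m travelled at 120 km/h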















I don't mean to be insulting, but are you sure you are a computer programmer? You go into a lot of detail about trivial things, but the way you evaluate information does not have to happen the way you describe.

Why would you need to have interrupts when you are processing that amount of data at speed? Seriously, I think you are not at the level required for understanding this problem.

I am sure there are a lot of advances in computer vision, and it's not as simple or as complex as 'sending each bit through the core'.

I thought this was meant to be a fun thread anyway.

If I am going to have to sit there and monitor the thing, I may as well let my wife do it, and wake me when we are there. She can sleep while I sail and be refreshed for the drive back. If I drive to Perth from Sydney, how many wives do I need?













@ FormulaNova
Here is a quote from my post: ..."NOT A TRUE LOOK, BUT SOMETHING "VAGUELY COMPARATIVE"...
If you had bothered to read that heading above the maths I just quoted from my post, you would not need to bother with much of your post and its question.

@ FormulaNova Second, obviously you have no idea how a computer operates, nor of imaging!
I am a computer programmer! I have written two freeware desktop programs, one of them an imaging program!

Here is a link to the imaging program's code (an older version than the present one); to see the code properly you need to download the file and view it zoomed in Adobe Acrobat Reader.
www.scribd.com/doc/270239630/Sidewinder-Photo-Colour-Balancer


(Quote FormulaNova) Why would you need to have interrupts when you are processing that amount of data at speed?

To operate on data (e.g. to measure it, change it or correlate it) requires going through the instruction circuits in the core; that is what it is to be a CPU, that is what happens, and you cannot have a program or data without it - anything else as an "electronic computer" is impossible! A single core allows only one thread (a sub-program instruction set indexed on the stack) in operation at a time. To work on data from other sensors, the running thread is stopped (interrupted) after a particular amount of time, another set of instructions is loaded for another thread (another sub-program of the OS) to work on, and so on until all of it is completed.
That action is multi-threading. Because that is too slow, new CPUs now carry multiple cores so they do not need to stop one sub-program for another to run simultaneously.
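
A rough Python sketch of that single-core versus multi-core point, with a busy loop standing in for the real per-frame number crunching (the frame count and loop size are arbitrary):

# One worker has to take the "sensor" jobs one after another (time-slicing),
# while a pool of worker processes puts each job on its own core.
# The busy loop is only a stand-in for real image processing.
import time
from concurrent.futures import ProcessPoolExecutor

def process_frame(frame_id: int) -> int:
    total = 0
    for i in range(2_000_000):   # pretend this is filtering one sensor frame
        total += i % 7
    return total

if __name__ == "__main__":
    frames = list(range(8))

    start = time.perf_counter()
    for f in frames:             # one core, one job at a time
        process_frame(f)
    print("serial  :", round(time.perf_counter() - start, 2), "s")

    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:   # one job per core, genuinely simultaneous
        list(pool.map(process_frame, frames))
    print("parallel:", round(time.perf_counter() - start, 2), "s")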

About the maths: my imaging program's speed on a 24-megapixel image rates around the same speed and data throughput as the maths example.
windsolarhybridaustralia.x10.mx/LinuxSideWinderPhotoColourBalancerConfigInstal.html
With the driver-less car, it is just a look at how long an ordinary high-speed CPU takes to process a sensor's information, and how much trouble the vehicle would be in, in an emergency situation at speed (how far it may have travelled before it figures out what its obstacle IS/WAS [a non-descript problem on the sensor]). In computing that is called "matching", as with regular expressions, condition expressions and booleans.



Whichever way you look at it, for quite some time to come it would be a good idea to "be awake and monitor" the vehicle and environment as the Tesla incident in Florida shows.









Okay, no problem. It's just that the language you used seemed a bit unusual.

I think the way you view the problem is probably the way a software engineer would view it. A very serial, logical way to do it. Using general cores, this is how you would approach it, but why not use hardware that can do it much more easily?

From a hardware designer's perspective, you would see it a little bit differently, and make the hardware do a lot of the hard stuff.

You can have frame buffers, so there is no need to even 'accept' the data. It will just be there. You can work on it as needed, and embed some of the smarts in the hardware. Why write code to find patterns when you can create the hardware to do it, and do it much faster and in parallel.

Modern CPUs are generic. You can achieve a lot more in the same silicon by doing things in hardware constructs. Want to multiply every number in an array by 27.5? Doing it in software can take a lot of resources. Do it in hardware, and it happens much quicker.
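
A minimal sketch of that point using NumPy on an ordinary PC (not dedicated silicon, but the same idea of handing the whole array to one vectorised operation instead of looping in software):

# "Multiply every number in an array by 27.5", done the slow element-by-element
# way and then handed to NumPy, which runs one vectorised operation over the
# whole array. Array size is arbitrary.
import time
import numpy as np

data = np.random.rand(2_000_000)

start = time.perf_counter()
out = np.empty_like(data)
for i in range(data.size):       # "software" way: one element per trip through the core
    out[i] = data[i] * 27.5
print("element-by-element:", round(time.perf_counter() - start, 3), "s")

start = time.perf_counter()
out2 = data * 27.5               # one vectorised operation over the whole array
print("vectorised        :", round(time.perf_counter() - start, 3), "s")

The vectorised form is typically orders of magnitude faster, and purpose-built hardware pushes the same idea further.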

If you have enough 'cores' you can do lots, but if you have a requirement to move data from one location to another and operate on it, with huge amounts of data, it's going to be much more efficient to create a separate hardware function, or even a dedicated 'processor', to do just that.

I am sure machine vision uses a lot of parallel processing and custom logic. It just makes sense to do it that way. As an example, you don't need to see an exact image of what the car is seeing elsewhere in your program, you just need to identify things like lane markers, other cars, and people, and then pass that data upwards.
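
Purely as a hypothetical illustration of "pass that data upwards" - the field names and threshold values below are made up - the planning layer might only ever receive a short list of detections rather than raw frames:

# Hypothetical sketch: the vision stage hands the planner a few small detection
# records per frame instead of the raw pixels. Names and thresholds are invented.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    kind: str           # e.g. "lane_marker", "car", "person"
    distance_m: float   # estimated range to the object
    bearing_deg: float  # angle off the vehicle's centreline
    confidence: float   # 0.0 - 1.0

def plan_next_move(detections: List[Detection]) -> str:
    # The planning layer only ever sees this short list, never the camera frames.
    if any(d.kind == "person" and d.distance_m < 30.0 and d.confidence > 0.5
           for d in detections):
        return "brake"
    return "continue"

print(plan_next_move([Detection("person", 22.0, -3.0, 0.9)]))   # -> brake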

I remember back to the eighties, when computer graphics was still relatively primitive. Everything was done in software. Then someone figured out that you can move blocks of data around and manipulate them as you do so, and do it in hardware. The result was much better graphics than thought possible, with relatively slow processors. DMA has been around for ages, but to modify the data as you do so can save a lot of processing power.

I think you will see much more advancement in self-driving cars than you would expect.







(Quote FormulaNova)..."Modern CPUs are generic. You can achieve a lot more in the same silicon by doing things in hardware constructs. Want to multiply every number in an array by 27.5? Doing it in software can take a lot of resources. Do it in hardware, and it happens much quicker."...

Lower-level CPUs are often dedicated to exactly such actions; they are mainly referred to as "slave computers" because they fulfil very few functions with very few instruction circuit sets in the core (you only feed them their own type of meat), e.g. calculators. More usually they are software I/O calculators networked in, or they receive "signals" under their own data-packet I/O protocol. But to skip to the point: the "tactic or methodology" you are pointing to there is as old as computers that ran tape and communicated across phone and radio lines, such as company or institution VAX machine networks.

If you understand how radar finds the position of an aircraft by clock time cycle (send, receive), the speed of light and the returned frequency heard, you would understand that being right in recognising the "picture" and identifying its contents is a vast job in the real world, particularly if there are no lane markers; the final identification of an obstacle would be as much a maze (puzzle) of information as what I'm writing here.
e.g. If it were simply a grey hoodie, is it a person or an animal if there is no EVERLAST, LONSDALE, ADIDAS or NIKE written on it?
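
For the radar part, the basic sums are simple enough; a small Python sketch with made-up echo-delay and Doppler-shift values (the 77 GHz figure is only a typical automotive radar band, not the spec of any particular car):

# Range falls out of the echo delay (half the round-trip time multiplied by the
# speed of light), radial speed out of the Doppler shift. Example values are invented.
C = 299_792_458.0   # speed of light in m/s

def radar_range_m(echo_delay_s: float) -> float:
    return C * echo_delay_s / 2.0

def radial_speed_m_s(transmit_hz: float, doppler_shift_hz: float) -> float:
    # two-way Doppler approximation: shift = 2 * v * f / c
    return doppler_shift_hz * C / (2.0 * transmit_hz)

print(round(radar_range_m(1e-6), 1), "m")                # a 1 microsecond echo ~ 150 m away
print(round(radial_speed_m_s(77e9, 15_400), 1), "m/s")   # 77 GHz radar, ~30 m/s closing speed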

It is all very well to recognise conditions when they are what they should be, but what happens when they are not is the real challenge; and as you know, if you hold an Australian driver's licence you are not allowed to pull up for pets, so as not to endanger humans and others in the process!
The place where the Tesla fatality occurred has an airport, and these driver-assistance or driver-less car computers use other systems as well, such as radar and sonar.
Aircraft propellers do not simply emit a low droning sound; vibrations, like transmitters (oscillators that are not operating properly), all produce harmonic frequencies (apparently there are effects with radio waves classed as similar to Doppler too) that could have interfered. The crash details finally released by Tesla were that the vehicle's computer had recognised the truck's trailer as an overhead road sign, which seems to be some sort of calibration deficiency of range and size more than a camera-view deficiency (or maybe a flawed assumption by the software).

(The maths was just an idea of how much time a high-speed chip leaves the software to solve the problem.)

Like Johnny in Flying High (Aeroplane) said ... What do you make of this...


A good guess is that to have driver-less cars would take a huge quantity of extra equipment and rules set up for such vehicles alongside the present traffic system!
I believe it is something that can never be achieved SAFELY by the individual vehicle's equipment alone!
It is simply a very underhandedly hammed-up tout to everyone (not saying they cannot do quite a lot with slightly lesser systems; just do not pretend such a system isn't suffering the same problems as a driver does, e.g. on long road trips hour after hour).

E.G. Automatic policing ( police-less policing ? )



As you can see these types of technology have been attempted before with much the same result as the Tesla fatality ....




You do know they are on the road already??



There are at least two where i get around, one likes to park itself!

hilly
WA, 7323 posts
2 Dec 2017 8:47PM
Thumbs Up

nicephotog said..

There are at least two where i get around, one likes to park itself!


mobile.abc.net.au/news/2017-11-29/uber-style-driverless-cars-to-be-tested-in-perth-in-global-trial/9207120?pfmredir=sm

Mastbender
1972 posts
3 Dec 2017 2:00AM
Thumbs Up

I'd tip it over just to see if it could right itself.




nicephotog
NSW, 251 posts
3 Dec 2017 5:43PM
Thumbs Up

Select to expand quote
Mastbender said..
I'd tip it over just to see if it could right itself.







Tall order. There were some military test vehicles being developed before 2000, but for the domestic market and cars it is only toys, or this.



Of the Perth driver-less trial, 2021 seems more realistic for Level 3 or above, and it goes to show the technology is not here and now: truthfully, just about every system around is Level 1 or Level 2. As usual, when governments want to underhandedly push something for their own benefit, it gets air play everywhere, far heavier than an internet forum mentioning price and brand, overriding all ethical concerns of any form!
At least with that we will get some statistics in the next few years!

quikdrawMcgraw
1221 posts
4 Dec 2017 7:06AM
Thumbs Up

I'd rather drive myself any day

rod_bunny
WA, 1089 posts
4 Dec 2017 8:57AM
Thumbs Up

I've already got a driverless car.


Every time I drive to the pub my other half drives me home.

evlPanda
NSW, 9202 posts
4 Dec 2017 3:09PM
Thumbs Up

Select to expand quote
nicephotog said..

============== I do not give them the point: it would require a "super cradle computer" to properly assess the information at the speed required!



Sure, but if you convert pi to binary you'll find the answer.

Ian K
WA, 4049 posts
4 Dec 2017 1:01PM
Thumbs Up

Select to expand quote
quikdrawMcgraw said..
Id rather drive myself anyday




Fair enough, it'll probably be legal to do so for some time yet. But given how the statistics are panning out, would you rather the other cars sharing the road with you be self-driving or driven by people?

rod_bunny
WA, 1089 posts
4 Dec 2017 3:40PM
Thumbs Up

Select to expand quote
Ian K said..

quikdrawMcgraw said..
Id rather drive myself anyday





Fair enough, it'll probably be legal to do so for some time yet. But given how the statistics are panning out would you rather the other cars sharing the road with you to be self-driving or driven by people?


Self driving... So long as they're allowed to do 110 km/h on Indian Ocean Drive.

quikdrawMcgraw
1221 posts
4 Dec 2017 4:57PM
Thumbs Up

Select to expand quote
Ian K said..

quikdrawMcgraw said..
Id rather drive myself anyday





Fair enough, it'll probably be legal to do so for some time yet. But given how the statistics are panning out would you rather the other cars sharing the road with you to be self-driving or driven by people?


People

Ian K
WA, 4049 posts
4 Dec 2017 6:37PM
Thumbs Up

Select to expand quote
rod_bunny said..


Self driving...So long as they're allowed to do 110km on Indian Ocean drive


A self driver should be able to get away with going 10 kph over while everyone else gets booked.
www.theguardian.com/technology/2017/jan/16/tesla-allows-self-driving-cars-to-break-speed-limit-again

nicephotog
NSW, 251 posts
4 Dec 2017 9:53PM
Thumbs Up

Maybe if we were to take a look at what goes wrong, seeing the world from the perspective of other driving and vehicle mishaps, an answer for the Tesla accident could be extrapolated (a bit of thought-shotgunning).
NB: driver-less is Level 3 at least, and that is a way bigger jump at this time.

Wonder if this is a fail at the target parameters by an XXX robot.

dmitri
VIC, 1040 posts
5 Dec 2017 10:54AM
Thumbs Up

mobile.abc.net.au/news/2017-11-29/uber-style-driverless-cars-to-be-tested-in-perth-in-global-trial/9207120?pfmredir=sm


Let's see how it will work:
I order a "driverless" car via the app to take me from A to B.
There is a driver "human chaperone" behind the wheel.
At some stage the human chaperone takes hands off the wheel:
Human Chaperone : " Hey look, no hands !"
Me: " Wow, that's so cool"
Human Chaperone: " How would you like to pay ? We accept all major credit cards"..

And that's all, folks you can do with the "driverless" cars.

nicephotog
NSW, 251 posts
6 Dec 2017 2:19PM
Thumbs Up

dmitri said..
mobile.abc.net.au/news/2017-11-29/uber-style-driverless-cars-to-be-tested-in-perth-in-global-trial/9207120?pfmredir=sm


Let's see how it will work:
I order a "driverless" car via the app to take me from A to B.
There is a driver "human chaperone" behind the wheel.
At some stage the human chaperone takes hands off the wheel:
Human Chaperone : " Hey look, no hands !"
Me: " Wow, that's so cool"
Human Chaperone: " How would you like to pay ? We accept all major credit cards"..

And that's all, folks you can do with the "driverless" cars.




You can also take your XX robot back from the shop and park somewhere out of sight on the way, but if it's a "qvicky" and you need to keep moving, just be sure it doesn't go down on any of the manual controls!
(Don't forget to ask your robot and driver-less car whether they haven't had cancer some time before.)


