Editor’s Note: This could be the start of the New World Order MATRIX, in which every ‘thing’ in the world can be located and tracked on the internet.
Augmented Google-Earth Tracks Real-Time People, Cars, Weather
Cryptogon
September 30, 2009
The surveillance side of this is chickenfeed. There’s something far more sinister than simple surveillance… an angle we haven’t heard about yet.
Tice never did tell his story to Congress about this different aspect of the program.
Well, my guess is that it has something to do with providing surveillance data for this SEAS World Sim thing, and that individual Americans are being watched and potentially targeted with it. Tice’s background seems to involve a lot of traditional electronic warfare, radar, and ELINT work. Maybe Tice’s role involved collecting mobile phone GPS and/or triangulation data, which would provide real-time spatial/geographic data to the SEAS system. In other words, SEAS sees you. They could bring up a map of a city and plot your path based on the information your phone exchanges with the mobile network.
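To make the triangulation speculation concrete: a handset’s position can in principle be estimated from its distances to a few cell towers at known locations. The sketch below is purely illustrative (nothing here comes from Tice or SEAS); it solves the classic trilateration problem for three towers by linearizing the circle equations into a 2×2 system.

```python
def trilaterate(towers, distances):
    """Estimate a handset's (x, y) from distances to three towers
    at known positions.

    Subtracting the first circle equation from the other two cancels
    the quadratic terms, leaving a 2x2 linear system A @ [x, y] = b.
    """
    (x1, y1), (x2, y2), (x3, y3) = towers
    d1, d2, d3 = distances
    # Coefficients of the linearized system
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    # Solve by Cramer's rule (towers must not be collinear)
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Towers at known positions, distances measured to a phone at (3, 4)
position = trilaterate([(0, 0), (10, 0), (0, 10)],
                       [5.0, 65 ** 0.5, 45 ** 0.5])
```

Repeating this estimate every few seconds, as the phone re-registers with the network, is all it would take to plot a path on a city map.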
—Synthetic Environments for Analysis and Simulation
Via: Popular Science:
Researchers from Georgia Tech have devised methods to take real-time, real-world information and layer it onto Google Earth, adding dynamic information to the previously sterile Googlescape.
They use live video feeds (sometimes from many angles) to find the position and motion of various objects, which they then combine with behavioral simulations to produce real-time animations for Google Earth or Microsoft Virtual Earth.
They use motion capture data to help their animated humans move realistically, and were able to extrapolate cars’ motion throughout an entire stretch of road from just a few spotty camera angles.
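Filling in a car’s track between sparse camera sightings can be done with simple interpolation. This is a minimal sketch, not the Georgia Tech researchers’ actual method: it assumes roughly constant velocity between cameras and linearly interpolates position at any query time.

```python
from bisect import bisect_right

def interpolate_track(sightings, t):
    """Estimate a vehicle's (x, y) at time t from sparse timestamped
    camera sightings, assuming constant velocity between them.

    sightings: list of (time, x, y) tuples sorted by time.
    """
    times = [s[0] for s in sightings]
    # Clamp to the first/last sighting outside the observed window
    if t <= times[0]:
        return sightings[0][1:]
    if t >= times[-1]:
        return sightings[-1][1:]
    # Find the pair of sightings bracketing t
    i = bisect_right(times, t)
    t0, x0, y0 = sightings[i - 1]
    t1, x1, y1 = sightings[i]
    f = (t - t0) / (t1 - t0)  # fraction of the way between sightings
    return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))

# Two cameras see the car 10 seconds apart; estimate it halfway between
midpoint = interpolate_track([(0, 0.0, 0.0), (10, 100.0, 0.0)], 5)
```

A real system would snap these interpolated positions to the road network and drive animated vehicle models along them in the virtual globe.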
From their video of an augmented virtual Earth, you can see if the pickup soccer game in the park is short a player, how traffic is on the highway, and how fast the wind is blowing the clouds across the sky.
Up next, they say they want to add weather, birds, and motion in rivers.
Ubiquitous Computing: Big Brother’s All-Seeing Eye