Apple’s New Satellite Service launched today
1. Apple's involvement with satellite communication companies
For its latest service launch, Apple teamed up with satellite company Globalstar (NYSE: GSAT) and antenna maker Cobham Satcom to provide connectivity to and from iPhone 14 and iPhone 14 Pro devices.
“A $450 million investment from Apple’s Advanced Manufacturing Fund provides the critical infrastructure that supports Emergency SOS via satellite for iPhone 14 models. Available to customers in the US and Canada beginning later this month, the new service will allow iPhone 14 and iPhone 14 Pro models to connect directly to a satellite, enabling messaging with emergency services when outside of cellular and Wi-Fi coverage.”
"In 2021, Apple announced an acceleration in its US investments, with plans to make new contributions of more than $430 billion over a five-year period."
2. How the SOS system works
"When an iPhone user makes an Emergency SOS via satellite request, the message is received by one of Globalstar’s 24 satellites in low-earth orbit traveling at speeds of approximately 16,000 mph. The satellite then sends the message down to custom ground stations located at key points all over the world."
"The ground stations use new high-power antennas designed and manufactured specifically for Apple by Cobham Satcom in Concord, California. Cobham’s employees engineer and manufacture the high-powered antennas, which will receive signals transmitted by the satellite constellation. Along with communicating via text with emergency services, iPhone users can launch their Find My app and share their location via satellite when there is no cellular and Wi-Fi connection, providing a sense of security when off the typical communications grid."
"Once received by a ground station, the message is routed to emergency services that can dispatch help, or a relay center with Apple-trained emergency specialists if local emergency services cannot receive text messages."
3. Where the SOS system is available
Emergency SOS via satellite is available in the US and Canada starting today, November 15, and will come to France, Germany, Ireland, and the UK in December.
Links:
https://www.apple.com/newsroom/2022/11/emergency-sos-via-satellite-made-possible-by-450m-apple-investment/
https://www.apple.com/newsroom/2022/11/emergency-sos-via-satellite-available-today-on-iphone-14-lineup/
https://support.apple.com/en-us/HT213426
https://ast-science.com
https://investors.globalstar.com
Topics discussed in the first pod include AI, our personal data, and self-driving cars. These are all future technologies that will integrate further into our lives, but we need to be mindful of the ethical challenges we will face as we are exposed to these algorithms.
The company behind the popular dating/social app Bumble has released an open-source project to detect unsolicited photos of men's genitalia sent to users' DMs. They released it to help combat the sexual harassment that runs amok in this filthy digital space.
Now I wonder: how did they obtain the training dataset for the artificial intelligence to learn from? How did they get users' consent to have images of their sexual organs saved and used by a company to build an application? Does the machine doing the machine learning get harassed in the process?
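For anyone curious what using a detector like this looks like in practice, here is a minimal, hypothetical sketch of running inference with a TensorFlow/Keras image classifier. The model path, input size, preprocessing, and single-probability output are assumptions for illustration, not the actual API of Bumble's open-source project.

```python
import tensorflow as tf

def load_detector(model_dir: str) -> tf.keras.Model:
    """Load a trained image-classification model from disk (path is hypothetical)."""
    return tf.keras.models.load_model(model_dir)

def is_lewd(model: tf.keras.Model, image_path: str, threshold: float = 0.5) -> bool:
    """Score an image and flag it if the lewd probability exceeds the threshold."""
    raw = tf.io.read_file(image_path)
    img = tf.image.decode_image(raw, channels=3, expand_animations=False)
    img = tf.image.resize(img, (480, 480))      # assumed input resolution
    img = tf.cast(img, tf.float32) / 255.0      # assumed [0, 1] scaling
    batch = tf.expand_dims(img, axis=0)         # model expects a batch dimension
    score = model.predict(batch, verbose=0)     # assumed output: (1, 1) probability
    return float(score[0][0]) > threshold

# Hypothetical usage:
# model = load_detector("path/to/exported_model")
# print(is_lewd(model, "incoming_dm_photo.jpg"))
```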
Apple has released an iOS 16 security patch for a security vulnerability.
According to the official advisory, the flaw allowed code to be executed with elevated permissions, which could be used to run malicious code on an iPhone user's device. For more details on the security patch: https://support.apple.com/en-us/HT213489
“Buzz, buzz buzz” my phone vibrates as I receive yet another notification today.
How long do I wait before I look?
What could it be?
The notification could’ve been triggered by any of the plethora of applications I have installed on my phone.
The tension builds and I begin to feel anxious, like a child on Christmas Eve waiting for the clock to strike midnight. Now my smartwatch's heart rate monitor is triggered and warns me of an elevated heart rate.
So, I give in.
I reach for the phone and the enchanting black mirror illuminates to life. I see the message it had waiting for me, which says, “Rain expected in 20 minutes at your location.”
-_-
The buzzing and humming of our phones happens many times a day, and it has gradually become the norm. We've become dependent on our electronics and their algorithms while placing a blind trust in their utility. However, let's not forget these 4 times algorithms f*cked humanity.
Y2K Bug
“The Commerce Department’s $100 billion estimate covers the cost of testing and repairing computers affected by the Y2K problem from 1995 to 2001. It does not, however, include the money that businesses have spent on new machines to replace older ones that have date glitches, which economists say could provide some long-term economic benefits through productivity gains.
The Commerce estimate also doesn’t take into account firms’ Y2K-related publicity campaigns or the possible cost of litigation stemming from undiscovered glitches. As a result, some economists believe overall Y2K spending probably is somewhat higher, perhaps closer to $150 billion.”
To save a little storage, some software engineers used only the last two digits to denote the year. This forced companies to retroactively review code running in production and implement fixes for the issues caused by the missing two leading digits of the year.
Overall, the Y2K fix was needed to prevent problems in applications such as scheduling, trend analysis, and time-based calculations: for example, banking applications that calculate interest on your account or determine whether you can tap into your 401(k) with or without penalty.
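A tiny sketch of the underlying bug may help: with only two digits stored, simple year arithmetic goes negative across the century boundary, which is exactly what breaks interest and other time-based calculations. The figures below are made up for illustration.

```python
# Classic two-digit-year bug: "99" and "00" instead of 1999 and 2000.

def years_elapsed_buggy(start_yy: int, end_yy: int) -> int:
    """Two-digit arithmetic: 00 - 99 yields -99 instead of 1."""
    return end_yy - start_yy

def years_elapsed_fixed(start_yyyy: int, end_yyyy: int) -> int:
    """Four-digit (expanded or windowed) years keep the arithmetic correct."""
    return end_yyyy - start_yyyy

principal, rate = 10_000.0, 0.05  # illustrative account balance and simple interest rate
print(principal * rate * years_elapsed_buggy(99, 0))       # -49500.0: bogus negative interest
print(principal * rate * years_elapsed_fixed(1999, 2000))  # 500.0: one year of interest
```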
“We found that single-vehicle collisions comprised the majority of crashes (51 cases, 32% of overall incidents), with crashes with other vehicles (26 cases, 17%) and crashes with pedestrians and bikes (13 cases, 8%) making up the remainder of crash incidents.”
Lin, Allen, Kate Kuehl, Johannes Schöning, and Brent Hecht. "Understanding 'Death by GPS': A Systematic Analysis of Catastrophic Incidents Associated with Personal Navigation Technologies." 2017. doi:10.1145/3025453.3025737.
People driving to unfamiliar places trust GPS algorithms to get them to their dream destinations, but sometimes end up in nightmare situations. Traveling into the great unknown is something humanity has always done, from primitive methods such as following the North Star to modern-day GPS applications. GPS has made traveling so much easier that we sometimes take for granted the complexity involved in finding a path to our destination, but is it the safest?
One unfortunate, fatal account comes from a Canadian couple who in 2011 set out on a road trip from their home in British Columbia to Las Vegas, a trip that ended in tragedy because they depended on GPS to navigate their way to Sin City. Sadly, the couple took a route through the desert that the GPS suggested was the best one, and ended up stuck in thick mud on the rough terrain they encountered along the way.
The husband, Albert Chretien (59), left his wife, Rita Chretien (56), in the vehicle while he went to search for help. Rita was found in the vehicle seven weeks later, 30 pounds lighter; she was rushed to the nearest hospital and recovered fully. Albert, however, was found dead about a year later by hunters.
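As a rough illustration of how this can happen, a routing algorithm that minimizes distance alone has no notion of "unmaintained desert track." The toy graph and mileages below are invented, and real navigation systems weigh many more factors, but the failure mode is the same: the shorter route wins even when it is the riskier one.

```python
import heapq

def dijkstra(graph, start, goal):
    """Return (total_distance, path) minimizing road distance only."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, edge_dist in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (dist + edge_dist, nxt, path + [nxt]))
    return float("inf"), []

# Made-up mileages: the desert track is shorter, so a distance-only router picks it.
roads = {
    "Home":             [("Highway junction", 30), ("Desert track", 25)],
    "Highway junction": [("Las Vegas", 470)],
    "Desert track":     [("Las Vegas", 460)],  # shorter, but unpaved and remote
}
print(dijkstra(roads, "Home", "Las Vegas"))  # picks the desert track: 485 miles vs 500
```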
“Krafcik (John Krafcik, CEO of Waymo (owned by Google’s parent)) said that you have to be sensitive to the real losses people suffer in accidents such as these. But, he added, achieving the bigger picture — eliminating the roughly 35,000 annual auto fatalities, largely due to driver error — means not being deterred by the “bumps in the road” to accident-free driving. Krafcik was basically putting a fresh coat of paint on an old, rarely spoken platitude: People must get killed en route to a better, safer transportation system.”
Korman, Richard. “Give Us the Risk Reality.” ENR: Engineering News-Record, Aug. 2018, p. 52.
The future certainly is here, and the wealthiest man in the world now owns one of the trailblazing companies introducing "autonomous" machines to our roads. However, with "beta" software being deployed as Tesla Autopilot, one has to question how transparent the company is with consumers and the general public about the risks of letting these algorithms control two-ton metal machines armed with li-ion batteries doing 70 mph on a road near you. It is marketing genius, but it puts people in danger by misleading customers into believing a Tesla is a "self-driving" vehicle. On the SAE scale of driving automation, which runs from Level 0 (no automation) to Level 5 (full autonomy), a Tesla is, at best, Level 2.
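For reference, here is a small sketch of the SAE J3016 classification the paragraph leans on, with one-line paraphrased descriptions; the enum labels are my own shorthand, not official wording.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # human driver does everything
    DRIVER_ASSISTANCE = 1       # steering OR speed assistance (e.g. adaptive cruise)
    PARTIAL_AUTOMATION = 2      # steering AND speed assistance, driver must supervise
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions, driver on standby
    HIGH_AUTOMATION = 4         # no driver needed within a defined operating domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

tesla_autopilot = SAELevel.PARTIAL_AUTOMATION  # "at best, Level 2" per the text above
print(tesla_autopilot.name, int(tesla_autopilot))
```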
With the slick marketing, one Tesla customer unfortunately became too reliant on these "self-driving" features and died in a collision with a truck.
“According to internal materials reviewed by The Intercept, Dataminr meticulously tracked not only ongoing protests, but kept comprehensive records of upcoming anti-police violence rallies in cities across the country to help its staff organize their monitoring efforts, including events’ expected time and starting location within those cities. A protest schedule seen by The Intercept shows Dataminr was explicitly surveilling dozens of protests big and small, from Detroit and Brooklyn to York, Pennsylvania, and Hampton Roads, Virginia.”
Biddle, Sam. “Police Surveilled George Floyd Protests With Help From Twitter-Affiliated Startup Dataminr.” The Intercept, 9 July 2020, theintercept.com/2020/07/09/twitter-dataminr-police-spy-surveillance-black-lives-matter-protests.
What should we do when we witness injustice? Usually, we think of spreading information, organizing, and protesting. However, since we are using the World Wide Web, the powers that be also have access to the ideas you are spreading. With social media platforms already tracking and cataloging your data, it has become much easier to suppress activism. For instance, the company Dataminr reportedly tracked BLM activists on social media and helped police know about these activists' planned activities.
These four events have occurred, yet we are still alive and breathing. However, we should reflect on them so that we learn not to place blind faith in applications or their creators. We trust the applications we use so much that we often neglect to read the terms and conditions. We are quick to click the "I Agree" checkbox and the "Next" button to finally get access to the latest application everyone is buzzing about, without knowing what we traded away for that access. The old adage goes, "if something seems too good to be true, it probably is," yet we haven't applied it to the technology we use.
Life is more complex than anything an algorithm can currently predict or handle. We, the human element, are the counterbalance that ensures technology and technology companies do not make life-altering decisions without our final say. Using our empathy and compassion, which is what separates us from machines, we will need to provide the checks and balances necessary to prevent questionable uses of the algorithms that are so integrated into our lives. I ask that my fellow software engineers, and the tech companies that employ us, be more transparent about the AI being deployed on society, so that the public can make its own risk assessment of decisions made by AI that may be trained on biased or skewed datasets.