SECURITY WEEK | By Eduard Kovacs – The controversial case in which the FBI asked Apple to unlock the iPhone belonging to the San Bernardino Islamic terrorist shooter appears to have ended after the law enforcement agency announced that it managed to hack the phone with the aid of an outside party.
The FBI reportedly achieved the task with the help of Israel-based mobile forensics firm Cellebrite. Many have called for the agency to inform Apple about the method used to unlock the phone so that the company can take steps to protect its customers, but experts believe that is unlikely to happen, and reports have already surfaced about the FBI agreeing to help prosecutors hack other Apple devices.
The U.S. government has been accused of trying to create a precedent when it attempted to convince a judge to force Apple to create a backdoor to the iPhone, with many experts arguing that well-resourced agencies like the NSA could likely hack the shooter’s phone.
After the FBI announced that it no longer needs Apple’s help, the agency’s director James Comey denied the accusations, saying that they never lied about their ability to access the phone and the case was not about trying to send a message or set a precedent.
Industry professionals contacted by SecurityWeek have analyzed the case and shared thoughts on what technique could have been used to hack the iPhone, whether or not Apple should be informed about it, and how much such an exploit costs.
And the feedback begins…
Carl Leonard, Principal Security Analyst, Forcepoint (formerly Raytheon|Websense):
“Details of the ‘exploit’ the FBI reportedly used to access this iPhone are not well known. However, past events offer a sense of how valuable this knowledge could be in the market. A widely stated figure for Apple 0-days is around $250,000; the going rate for browser exploits is typically about one-fifth of that. In November 2015, Zerodium offered up to $1 million for an iOS exploit.
Pricing, as always, is about supply and demand. Sellers have to be willing to sell and buyers have to be willing to buy. Bidding wars come into play and prices decline over time as protection for a discovered exploit is offered, thus reducing the effectiveness of the malicious code. Other market factors that come into play on pricing include how easy it is to create the exploit, how reliable it is, how easy it is to use, the number of susceptible platforms it affects and how well it is detected by security vendors.”
Chenxi Wang, chief strategy officer, Twistlock:
“It’s entirely possible that there is a vulnerability that nobody knew about. It’s possible that the FBI was able to discover it and develop an exploit for the vulnerability. Software is extremely complex, and even Apple cannot provide 100 percent security. It’s not out of the question.
If it is a vulnerability, it’s likely in the public interface, allowing someone to hack it without special help from the manufacturer. I’m sure Apple can find it themselves if they look hard enough — they have plenty of resources. However, if the FBI does discover a vulnerability, they should share it with Apple in the spirit of responsible disclosure. We hope they do.
Also, that vulnerability is likely going to be worth tens of thousands of dollars on the black market, depending on what it allows you to access. It could even be as high as hundreds of thousands of dollars if it allows you to take control of the entire firmware.”
Ed McAndrew, former federal cybercrime prosecutor, currently a lawyer at Ballard Spahr:
“This battle has been interrupted, not defused. Because so many of the All Writs Act (AWA) applications are under seal, the “next” case is likely on some judge’s desk now, although it may not involve Apple. Because different judges will undoubtedly reach different rulings, perhaps on similar facts, we can expect geographical disparity in cyber-investigative law across the country. Such disparities, particularly where such compelling and competing public interests exist, are often best resolved by Congressional, not judicial, action.
The case raises the question whether data can be “too secure for law enforcement.” Are we elevating individual data security over public safety?
In many ways, the AWA worked perfectly in this case. The manufacturer was able to contest the government’s assertions as to the burden and necessity of assistance. When Apple’s assistance was no longer necessary, the particular controversy before the court was resolved.
The case illustrates the compelling and countervailing interests in data security of one person’s device versus the broader interest in public safety. Greater data security for consumers necessarily includes greater data security for criminals. Technological and legal notions of data security are not necessarily the same.”
Nathan Wenzler, Executive Director of Security, Thycotic:
“With the FBI rescinding its case against Apple to provide access to the San Bernardino shooter’s phone, questions still remain about how the FBI was “suddenly” able to gain access without Apple’s help. Many have speculated on who, and how, but most experts agree that some sort of previously unknown exploit was used to either bypass the 10-try lockout wipe or to circumvent all of the security controls and scrape the information directly from memory. Whatever the method, it seems safe to say that the FBI now possesses this exploit and will reuse it in similar situations. It is, essentially, the same end result they were hoping to achieve by getting the courts to force Apple to provide them the means to get into iPhones like this on their own.

Contrary to some media reports, the original court order made it clear that the FBI wanted Apple to create software that would allow them access and would require Apple to turn it over once complete. However, with public sentiment turning against them and strong, and likely unexpected, legal resistance from Apple, the FBI simply turned to another outside source to get what they needed.
Now, what this exploit is, exactly, is going to be left open to a lot of discussion. More important, though, is whether or not Apple will figure out what the exploit is and fix it in a timely manner. This may be a near-impossible thing to accomplish, given the FBI’s almost-certain unwillingness to disclose the exploit in the name of good security and overall benefit to the populace. Apple may simply need to take a broader approach and strengthen the encryption methods and security protocols in all of its devices to the point where exploits against the login lockout or memory-scraping tools become infeasible, regardless of access. It’s a tricky endeavor, and it still doesn’t guarantee complete security, but it may be the only reasonable means Apple has to counter whatever exploit the FBI has obtained. The war between security and privacy has only just begun.”
Eric Lundbohm, CMO, iSheriff:
“The recent reversal of the FBI’s legal action against Apple is purportedly because they found another method to open the iPhone 5C owned by the San Bernardino shooter. This is indeed a likely scenario, as there are many exceptionally smart security people in the world and the infamy of this case likely brought opportunists from the hacker or security worlds.
Now the tables have been turned on Apple, who is considering the legal route to force the FBI to divulge specifics of the technique used to open the phone in question. Odds are that they will figure it out at some point or retain the firm that helped the FBI to help them understand the hack and close any opening that has been discovered.
It’s more likely that the method used focused on the iTunes software link that could be used to restore the phone under more normal conditions. Spoofing that link, and refining the technique against a “safe” iPhone 5C until it works, seems a plausible avenue for finding a hack.”
Elad Yoran, Executive Chairman, KoolSpan:
“If the FBI has a working exploit they used to access the iPhone at issue, the agency should report the exploit to Apple. However, I am not optimistic that the government will do this. Meanwhile, the chances that Apple will figure out how the FBI accessed the iPhone are very good. Apple has tremendous assets it brings to the table – some of the best engineers, including security experts, tremendous financial resources, a well-defined privacy and security vision, and a long-term view of product management and their relationship with Apple customers. Taken together, these factors virtually guarantee that the company will not only figure out this particular vulnerability, but also make their devices more secure over time.”
Ken Basore, SVP, Product Engineering, Guidance Software:
“My opinion is that the FBI knows generally what the compromise was, but doesn’t have all of the details. If they actually worked with Cellebrite, I know those guys and they will protect their IP from getting out as long as possible so that they have a competitive advantage. I would assume the agreement with the FBI was that they would not have to disclose details of how the compromise works.
There is no doubt in my mind that Apple will figure it out, assuming that they don’t already know about it. Tim Cook will devote millions of dollars to fix a compromise, especially now that it’s public.
The most likely candidate is a way to compromise the operating system. This is how almost all companies are able to ‘bypass’ the security on any mobile device. They figure out a vulnerability and exploit it. Apple learns about it and patches it. They try to find another one. And so on.”
James Bindseil, President and CEO, Globalscape:
“Some feel the Justice Department’s withdrawal of its case against Apple is a win for privacy and civil liberties, but the fact remains, whether the FBI has acquired the ability and expertise to crack iPhones or not, it has a resource that is apparently willing to work with the agency to gain access to the data. This means that the debate over whether the government can compel private companies to give them access—whether it’s through a “master key” type code, weak encryption or the creation of novel exploits—is far from over. And while the rush to identify and replicate the exploit is on, Apple is also at work figuring out how to make their phones more secure. That’s the vicious cycle of security.”
Michael DeCesare, CEO, Forescout:
“We need to first recognize the magnitude of this dispute between Apple and the FBI over encryption — it represents a significant shift from 10, even five years ago in how the private sector interacts with the government. In the past, companies would not hesitate to give law enforcement what they requested, but now we are experiencing a shift in power — granting more to consumer and technology companies, and exposing just how far behind the public sector is.
As devices become more personalized and capable of holding even more of our sensitive data, it’s critical that we develop a consistent framework that can be applied to these types of situations as they arise. This is an opportunity to jumpstart a meaningful debate on the way law enforcement — and the government in general — can collaborate with the private sector. It doesn’t have to be this difficult.
With that said, if the FBI does in fact have the exploit, it makes sense for them to share it with Apple, as it will only strengthen security in the future. Society needs to change its current mindset of us vs. them, good vs. bad, the people vs. the government. At the end of the day, we are all on the same side. We all want security and privacy — and to achieve this, the private sector needs to continue to work in tandem with technology companies. We need collaboration across the board, not segmentation — which seems to be how we are currently operating. And as we’ve just learned, our current approach just isn’t working.”
Kevin Watkins, CTO and Co-Founder, Appthority:
“Anything from the security community is still speculative [regarding the technique used to unlock the San Bernardino shooter’s iPhone]. That being said, it could be a 0-day vulnerability. More likely, and what the security community is siding with, is that the NAND mirroring technique was used. This is where the flash memory holding the keylock data is physically extracted off the device, the key combinations are attempted, and the extracted copy is written back to the device if it locks up.
If the NAND mirroring technique was used, then Apple was likely aware of this attack method. Later iPhone models (starting with the 5s) have a “Secure Enclave,” or dedicated tamper-resistant hardware on the device that handles key management and cryptographic operations. Basically, it wouldn’t be a simple copy off a chip like the NAND mirroring technique; any attempt to get to the data would brick the device.”
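The NAND mirroring idea Watkins describes can be illustrated with a toy simulation. This is a sketch only: the `Device` class, its ten-try wipe, and the 4-digit passcode space are invented stand-ins for the real hardware, where the mirrored state would be a physical image of the NAND flash chip rather than a Python attribute.

```python
class Device:
    """Toy model of a passcode-locked phone that erases itself after
    ten failed attempts. All names here are invented for illustration."""
    WIPE_AFTER = 10

    def __init__(self, passcode):
        self._passcode = passcode
        self.failed_attempts = 0  # stands in for the counter kept on NAND flash
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= self.WIPE_AFTER:
            self.wiped = True  # auto-erase triggered
        return False


def nand_mirror_bruteforce(device, max_code=10000):
    """Try every 4-digit passcode, writing a saved copy of the flash
    state back to the device before the wipe threshold is reached."""
    snapshot = device.failed_attempts  # "mirror" the flash state up front
    for code in range(max_code):
        guess = f"{code:04d}"
        if device.try_unlock(guess):
            return guess
        # Restore the mirrored state just before the lockout would trigger.
        if device.failed_attempts >= device.WIPE_AFTER - 1:
            device.failed_attempts = snapshot
    return None
```

In this model, `nand_mirror_bruteforce(Device("4298"))` recovers the passcode without ever tripping the wipe, which is the whole point of the technique: the attempt counter lives on removable storage, so an attacker who can copy and rewrite that storage can reset it at will.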
Oren Falkowitz, Area 1 Security CEO and ex-NSA analyst:
“What the FBI did in trying to compel the solution is a huge misstep by the agency. It’s not surprising that people across the globe would come out of the woodwork now – for money or publicity.
By making the unlock issue so public, the FBI is undermining the security of technology people use every day, and has put the public at greater risk of compromise and attack.
The U.S. government should be aligned with tech companies to create more secure products, better trust with public-private partnerships, and do everything possible to encourage a safer and more secure Internet.”