On March 19, Kaspersky filed a complaint against Apple Inc. with the Federal Antimonopoly Service. Our claim pertains to Apple’s policy on apps distributed through the App Store. Despite a long history of working successfully with Apple, we believe that this is a necessary step.
Why Kaspersky decided to apply to the FAS
It was the following situation that compelled us to file the complaint.
Last year, we received a notice from Apple saying that our Kaspersky Safe Kids for iOS app did not meet the requirements of paragraph 2.5.1 of the guidelines for apps hosted in the App Store. Apple had never before had any issues with Kaspersky Safe Kids; the app had been hosted in the App Store, meeting all of the guidelines, for nearly three years.
It turned out that, according to Apple, the use of configuration profiles was against App Store policy, and Apple demanded that these be removed, so that the app could pass the review and be published in the store. For us, that would mean removing two key features from Kaspersky Safe Kids: app control and Safari browser blocking.
Both features are essential. The first allows parents to specify which apps kids cannot run based on the App Store’s age restrictions. The second allows the hiding of all browsers on the device, so kids can open Web pages only in Kaspersky Safe Kids’ built-in secure browser, which protects them from unsafe content.
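Both features rely on installing an iOS configuration profile on the child's device. As a rough illustration only: the payload below is a hypothetical sketch based on Apple's documented Restrictions payload (com.apple.applicationaccess), with placeholder identifiers and UUIDs; it is not Kaspersky's actual implementation.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>PayloadType</key>
  <string>Configuration</string>
  <key>PayloadVersion</key>
  <integer>1</integer>
  <!-- Placeholder identifier and UUID for illustration -->
  <key>PayloadIdentifier</key>
  <string>com.example.parental-controls</string>
  <key>PayloadUUID</key>
  <string>00000000-0000-0000-0000-000000000001</string>
  <key>PayloadDisplayName</key>
  <string>Parental Controls (example)</string>
  <key>PayloadContent</key>
  <array>
    <dict>
      <!-- Apple's Restrictions payload -->
      <key>PayloadType</key>
      <string>com.apple.applicationaccess</string>
      <key>PayloadVersion</key>
      <integer>1</integer>
      <key>PayloadIdentifier</key>
      <string>com.example.parental-controls.restrictions</string>
      <key>PayloadUUID</key>
      <string>00000000-0000-0000-0000-000000000002</string>
      <!-- Hide the Safari browser on the device -->
      <key>allowSafari</key>
      <false/>
      <!-- Allow only apps with an App Store age rating of 12+ or lower -->
      <key>ratingApps</key>
      <integer>300</integer>
    </dict>
  </array>
</dict>
</plist>
```

Once such a profile is installed, the operating system itself enforces the restrictions, which is why losing the ability to ship configuration profiles removes these features entirely rather than merely degrading them.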
So, removing these two features from Kaspersky Safe Kids for iOS would massively let down parents, who expect that their kids will be able to safely use iPhones and iPads that have our app installed. We believe it is essential that all of our customers, whether they are young or old, are completely safe and get exactly what they expect.
Why we believe we are right
The change in Apple’s policy toward our app (as well as toward every other developer of parental control software), notably came on the heels of the Cupertino-based company announcing its own Screen Time feature as part of iOS 12. This feature allows users to monitor the amount of time they spend using certain apps or on certain websites, and set time restrictions. It is essentially Apple’s own app for parental control.
From our point of view, Apple appears to be using its position as platform owner and supervisor of the sole channel for delivering apps to users of the platform to dictate terms and prevent other developers from operating on equal terms with it. As a result of the new rules, developers of parental control apps may lose some of their users and experience financial impact. Most important, however, it is the users who will suffer as they miss out on some critical security features. The market for parental control apps will head toward a monopoly and, consequently, stagnation.
One might argue that the App Store is owned by Apple itself, so why should the company not call the shots? The problem is that Apple does not allow the use of any other software marketplaces for iOS, so it effectively controls the only channel for delivering apps from developers to users.
By setting its own rules for that channel, it extends its power in the market over other, adjacent markets: for example, the parental control software market, where it has only just become a player. It is precisely in this extension of its leverage through possession of so-called “key capacity” over other segments, leading to restriction and elimination of competition, that we see the essential elements of antitrust law violation, which consist of erecting barriers and discriminating against our software.
We have repeatedly tried to contact Apple to resolve this situation, but no meaningful negotiations have ensued.
We are not alone
The issue of a ban on the use of configuration profiles has, to varying degrees, affected every developer of parental control apps, not just us. This is, however, not the only issue that developers have with Apple. For example, Spotify recently filed a complaint against Apple with the European Commission, similarly claiming that the Cupertino company has been using its monopoly for advancing its services without giving others a chance to compete on equal terms.
Other developers of parental control solutions, who have lost the ability to restrict access to apps, are also less than thrilled. For example, AdGuard has found itself in a similar situation. And to name another example, the parental control app Kidslox can still be downloaded from the App Store, but updates have not passed Apple review, so they cannot get into the marketplace.
We found ourselves in a similar situation with Microsoft back in the day, but a recourse to the regulator helped us to solve the problem and go on working with the company on terms that were acceptable to both the entire cybersecurity industry and users.
We very much hope that we will also be able to continue our winning relationship with Apple, and that requires us to create an environment where Kaspersky and other companies compete on equal footing. The environment is very different at the moment, which is why we are in the process of applying to the Federal Antimonopoly Service.
Why monopolies are a bad thing
Development and progress are possible only amid healthy competition — that is, where companies that create similar products must come up with something to draw users to their idea. When a dominant entity emerges in the market, it immediately starts to set its own rules, which everyone has to follow. In many cases, these rules put many at a disadvantage, leaving them with no choice. Progress virtually stalls as a result, with the monopolist, facing no competition, not being compelled to evolve its ideas.
It is for the purpose of resolving situations such as this that regulators such as the Federal Antimonopoly Service exist, and issues of this type have traditionally been resolved at the government level. Nations are deeply concerned about the issue of monopolies; countries are interested in the healthy development of as many companies as possible, not just the biggest one.
Recently, for example, United States Senator Elizabeth Warren proposed prohibiting large companies that become monopolies, such as Facebook (which, incidentally, owns Instagram and WhatsApp), Google, Apple, and Amazon, from publishing apps on their own platforms. Warren proposes breaking up the big tech companies and prohibiting them from promoting their products on platforms they own. As things stand, they’re likely to give preference to their own products by default. Indeed, who wouldn’t?
The United States has been through that before, for example, with Standard Oil Co., which produced, transported, refined, and marketed petrochemicals. When a similar situation unfolded one hundred years ago, the U.S. government decided to break up Standard Oil. It split into a number of smaller companies, and the breakup eventually doubled the collective value of Standard Oil stock.
We are therefore confident that we are right and that our initiative will benefit the market at large. We very much hope that Apple will provide competitive terms to third-party developers, so that we can continue our winning relationship with the company and contribute to the advancement of progress.
Update: April 29, 2019
A recent New York Times article stated that “two of the most popular parental-control apps, Kidslox and Qustodio, filed a complaint with the European Union’s competition office.” We are glad that not one but two of our competitors decided to step forward. We see this filing as evidence that the current situation is recognized as inappropriate by many market participants.
Interestingly, Apple reacted to the New York Times article, first unofficially with an e-mail that was published by MacRumors and was apparently written by Phil Schiller, and then with an official statement. In its response, Apple blames parental control apps for using so-called MDM technology. Without getting into technical details, this technology does indeed look very attractive for parental control app developers. However, we never used MDM in Safe Kids.
Update: June 10, 2019. Regarding recent changes to the Apple App Store Review Guidelines
The New York Times reported recently that Apple has softened its position on third-party parental control software. We welcome the changes stated in the App Store Review Guidelines. These guidelines have become more responsive to iOS application developers’ needs, and they have attempted to clarify how parental control applications should be developed and then assessed by Apple.
However, the new edition of the “Apple Developer Enterprise Program License Agreement,” dated June 3, 2019, for independent developers, states that the use of Mobile Device Management (MDM) profiles and configuration profiles in applications for home users is possible only with the explicit written consent of Apple.
Given this information, as well as our previous experience of interacting with Apple, we expect to receive official written confirmation that the new paragraph 5.5 of the App Store Review Guidelines applies to Kaspersky Safe Kids for iOS, the company’s application for parental control. We made the appropriate request to Apple, but we have yet to receive a clear answer.
Until that happens, we are not prepared to change anything stated in our application to the Russian Federal Antimonopoly Service (FAS), and we continue to prepare a complaint to the European Commission.
Update: August 8, 2019. Regarding the FAS opening an antimonopoly investigation against Apple
This morning, the Federal Antimonopoly Service of Russia announced that it had started an investigation against Apple based on our antimonopoly complaint. Here is Kaspersky’s official statement regarding this.
Kaspersky welcomed the recent changes stated in the App Store Review Guidelines. These guidelines have become more responsive to iOS application developers’ needs, and attempted to clarify how parental control applications should be developed and then assessed by Apple.
The updated App Store Review Guidelines allow utilizing MDM for parental controls in limited cases, and the Apple Developer Enterprise Program License Agreement clarifies that the use of MDM profiles and configuration profiles in applications for home users is possible only with the explicit written consent of Apple.
However, Apple’s updated rules and restrictions do not provide clear criteria for when the use of these profiles is allowed, nor information on how to meet those criteria, both of which are needed to obtain written consent from Apple to use them.
Apple prohibits the transfer of data received from applications using MDM to third parties. It makes no exceptions, and it insists that this ban applies even to third-party analytical services, which are widespread in the mobile application industry and allow software products to be improved based on statistical analysis.
It is noteworthy that our case does not involve the transfer of user data from children’s devices at all: data is transferred only from the parent’s device, only regarding the operation of our software, and only with the explicit and informed consent of users. However, Apple has not heard our arguments.
Utilizing these analytical services is critical for mobile application developers, and Apple does not provide such tools. This means that, in the event of a real ban on using such services, Apple’s Screen Time will have competitive advantages that are not available to the applications of other players in this market.
To summarize, the softening of the conditions Apple announced leaves a number of questions unanswered, and it does not allow us to say that Apple has stopped abusing its market position.
In particular, this abuse remains in the following:
- Apple did not provide third-party iOS application developers the unconditional opportunity to use the technologies and APIs used or ones similar to those used in Apple’s own Screen Time application.
- Apple did not provide a transparent and clear procedure for obtaining the explicit written consent to use MDM profiles and configuration profiles. To obtain consent to use MDM and configuration profiles, third-party developers must pass a review with unclear deadlines and incomprehensible selection criteria.
- Apple’s requirements reduce the competitiveness of third-party parental control software with Apple’s Screen Time. Among Apple’s requirements for review, there is an apparent ban on the use of any third-party analytical services in applications, even with a user’s explicit consent, yet Apple offers no justification for banning services that are used by the entire mobile software development industry.
- Apple stated that consent to use the MDM profiles will be provided for a period of one year. At the same time, it is not indicated whether the same application requirements will be applicable after one year. The current situation allows Apple to change the rules of review and selection criteria at any time, as well as introduce new rules or interpret existing rules differently. This has been exemplified repeatedly in the past.
- Apple reserves the right to remove any application from the App Store at any time, without explanation, even those applications that satisfied all the requirements and passed the review. This rule is enshrined for Apple in the relevant documents, and it is an imposition of discriminatory conditions on third-party iOS application developers.
All in all, one-way communication, the marketing and technological advantages of Screen Time (which are not available to third-party developers of similar software), the nontransparency of procedures, and Apple’s officially enshrined right to take any action at all regarding third-party software together create barriers for third-party developers entering the market of parental control software for iOS. These barriers are the result of the actions of the dominant entity, in this case Apple, and they adversely affect the competitive environment of an adjacent market in which the dominant entity also offers its product.