The curated garden problem

When Apple first enabled third-party applications on the iPhone[1], many accused the company of violating user freedoms by creating a walled garden: all applications installed on the platform had to be installed via the iOS App Store[2]. Apple promoted this as enabling application vetting for performance and security, and particularly on earlier iPhone models with limited resources, this ‘curation’ process of vetting the reliability and safety of applications may very well have had a positive impact on the growth of the platform.

While Apple now allows alternate application installation – either for application beta testers, or for registered ‘enterprise’ application store experiences[3] – the experience for the average iOS user remains the same now as it was when third-party applications were first allowed: installation via Apple’s official App Store only.

Likewise, Google provides a curated application environment for its Android platform via the Play Store[4]. While Android offers more flexibility, allowing users to side-load applications that don’t originate from the Play Store, for a large number of Android users the experience will be very similar to that of iOS users – applications will typically be sourced only from the official store.

As of May 2017 there were an estimated 2.2 million applications in the iOS App Store[5], and Statista reports approximately 3.3 million applications in the Google Play Store as of September 2017[6]. In both cases this represents an extraordinarily large number of applications, and their diversity is extreme – business applications, essential productivity utilities, complex games and inane ‘doodads’ designed to mildly amuse or distract for no more than a few minutes.

In November 2017, ZDNet reported on an investigation by Reddit users into a fake WhatsApp[7] application in the Play Store. Per ZDNet[8]:

A fake version of the Android WhatsApp app was downloaded a million times from the Google Play Store before users discovered the fraud, and Google removed it.

The application was named ‘Update WhatsApp’, used an icon identical to the genuine WhatsApp icon, and disguised its different developer ID by appending a non-breaking space to the developer name (e.g., ‘WhatsApp Inc’ vs ‘WhatsApp Inc ’). ZDNet reported this was not uncommon, stating[9]:

Avast mobile security researcher Nikolaos Chrysaidos discovered more bogus WhatsApp apps over the weekend. He’s also flagged several other fake WhatsApp apps on Google Play over the last month, including fake Facebook Messenger apps.

The Play Store is widely recommended as the safest place from which to install Android [apps] but Google has had trouble keeping it free of malware.

While the fake WhatsApp app itself apparently did little direct harm, other fake applications have had more nefarious implications for unsuspecting users.
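The trailing non-breaking space trick is simple to demonstrate. The short Python sketch below is illustrative only – the developer name is taken from the report above, and the comparison logic is not any store's actual validation code:

```python
import unicodedata

# Two developer names that render identically in most UIs, but differ by a
# trailing non-breaking space (U+00A0) -- the trick used by 'Update WhatsApp'.
genuine = "WhatsApp Inc"
spoofed = "WhatsApp Inc\u00a0"

print(genuine == spoofed)   # False: a store treats these as distinct developer IDs

# NFKC normalisation maps U+00A0 to an ordinary space, but the trailing
# character survives, so the strings still differ...
print(unicodedata.normalize("NFKC", spoofed) == genuine)   # False

# ...whereas stripping Unicode whitespace exposes the impersonation.
print(spoofed.strip() == genuine)   # True: identical once whitespace is removed
```

The point is that a naive string comparison (or a human eye) sees two different developers, while even standard Unicode normalisation does not collapse them; a deliberate whitespace or confusable-character check is needed.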


Rogue apps are in fact a common problem on all platforms. In 2016, the New York Times reported[10]:

Hundreds of fake retail and product apps have popped up in Apple’s App Store in recent weeks — just in time to deceive holiday shoppers.

The counterfeiters have masqueraded as retail chains like Dollar Tree and Foot Locker, big department stores like Dillard’s and Nordstrom, online product bazaars like and Polyvore, and luxury-goods makers like Jimmy Choo, Christian Dior and Salvatore Ferragamo.

As the number of applications on the various platforms has grown, so too has the problem of undesirable applications. As early as September 2010, Apple and RIM were providing guidance on the sorts of applications that would not be allowed in their respective app stores[11]. (Both in particular felt that applications simulating flatulence had well and truly received enough developer attention, and cautioned that further apps dedicated to the function would not be welcomed into the stores.)

The problems facing mobile application stores are almost universally the same: fake reviews[12], fake applications[13], store fraud[14] and, depending on the platform, outright malware[15].

Companies like Apple and Google rely in no small part on enforceable developer agreements. Apple’s review guidelines[16], for instance, provide examples of the type of ‘objectionable content’ that will be rejected, including but not limited to:

1.1.1. Defamatory, discriminatory, or mean-spirited content, including references or commentary about religion, race, sexual orientation, gender, national/ethnic origin, or other targeted groups, particularly if the app is likely to humiliate, intimidate, or place a targeted individual or group in harm’s way.


1.1.4 Overtly sexual or pornographic material, defined by Webster’s Dictionary as “explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings.”

1.1.5 Inflammatory religious commentary or inaccurate or misleading quotations of religious texts.

Likewise, Google’s Play Store policy[17] provides a series of guides covering restricted content, spam, store listings, privacy, impersonation and family-friendly app requirements. For example, the Restricted Content section states:

Sexually Explicit Content

We don’t allow apps that contain or promote sexually explicit content, such as pornography. In general, we don’t allow content or services intended to be sexually gratifying.

(It might be noted that hookup applications such as Grindr, Scruff and Tinder skirt such requirements by being promoted as online dating applications – true enough for some users, at least.)

In the early days of the various app stores, curation of submitted applications was a time-consuming process, suggesting (at least at the time) a high level of manual review. The sheer number of applications now being added regularly to the iOS App and Google Play stores has, at least to a degree, shifted the focus onto self-regulation and end-user reporting. Undoubtedly, particular types of non-permitted applications might be caught on submission based on word-searches and similar checks, and fraudulent bulk reviews may be deflected through common protection techniques such as IP address matching or CAPTCHA[18] interception. A 2014 study (when the number of iOS apps had reached just over 1.2 million) suggested applications were at that time being added at a rate of up to 60,000 per month[19]. The rate is not constant – TechCrunch, for instance, reported 48,231 applications added in May 2016[20] – but growth is strong in both the Apple and Google app ecosystems.
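A word-search screen of the kind just described might look something like the following Python sketch. It is purely hypothetical – the term lists are invented for illustration, and this is not representative of Apple's or Google's actual submission checks:

```python
# Hypothetical automated screen applied at submission time. The term lists
# are invented for illustration only.
BLOCKED_TERMS = {"flatulence simulator"}                  # auto-reject
SUSPICIOUS_TERMS = {"update whatsapp", "free followers"}  # escalate to a human

def screen_submission(title: str, description: str) -> str:
    """Return 'reject', 'manual-review' or 'accept' for a submitted app."""
    text = f"{title} {description}".lower()
    if any(term in text for term in BLOCKED_TERMS):
        return "reject"
    if any(term in text for term in SUSPICIOUS_TERMS):
        return "manual-review"
    return "accept"

print(screen_submission("Update WhatsApp", "Get the latest messenger"))  # manual-review
print(screen_submission("Todo Pro", "Organise your tasks"))              # accept
```

Even a filter this crude can triage the bulk of submissions, leaving humans to review only the flagged remainder – which is precisely why it shifts, rather than removes, the need for manual curation.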

While Apple and Google both allow users to report applications they feel violate community or developer standards – or that do not do as they claim – the process may not always be as transparent as might be expected. For Apple’s App Store, for instance, the process appears to rely on the user navigating to the Apple Support contact page[21] and working through the contact mechanisms provided there. This is hardly self-explanatory, and demands a higher degree of effort and patience from the user than one might expect.

Google’s Play Store does include a link to report suspected violations with each application, but only at the very end of the application’s store page – which may require considerable scrolling once app details, reviews and additional advertising are taken into account. At least, however, the reporting option is present on every listing.

Both companies have repeatedly stated – through their developer guidelines and in published reports – a desire to ensure their app stores deliver an optimised user experience, something which logically requires attention to malicious or fraudulent code and reviews. (Indeed, one might argue that this represents a higher priority than, say, eliminating flatulence apps – particularly if there is a clear sign users want a diversity of such applications.)

It can readily be argued that Apple and Google, having established curated application marketplaces, have both legal and financial imperatives for removing applications that imperil users (physically or financially), along with fake reviews and malware. But are there ethical considerations? It is important to understand that ethical considerations would likely exist regardless of legal or financial imperatives, or of operational practicalities.

Side Note:

It is perhaps worthwhile dwelling briefly on those operational practicalities, however, since the cost of providing a service is often seen as a valid reason to abrogate responsibility. Return to the number of applications reported added to the Apple iOS App Store in May 2016 – 48,231. Even if this were now closer to the 60,000 per month predicted by adjust, does this represent a number too high to allow practical human examination for miscreant behaviour? Assuming 20 working days in a month and 8-hour work days, a single employee would need to review 375 applications per hour to accommodate 60,000 per month – clearly untenable. A team, however, could share the queue: devoting a minimum of 3.5 hours of review time per application, an individual employee could review approximately 45 applications per month, and so 60,000 applications per month would require approximately 1,333 staff.
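The arithmetic is easy to verify; the figures below simply restate the assumptions used above (60,000 apps per month, 20 working days of 8 hours, 3.5 hours per review):

```python
# Back-of-envelope staffing estimate, using the assumptions from the text.
apps_per_month = 60_000
hours_per_month = 20 * 8          # 20 working days of 8 hours = 160 hours

# A single reviewer clearing the queue alone would need:
print(apps_per_month / hours_per_month)        # 375.0 apps per hour -- untenable

# At 3.5 hours of review per application, one employee manages roughly:
apps_per_reviewer = hours_per_month / 3.5
print(round(apps_per_reviewer))                # ~46, or ~45 as rounded in the text

# Staff required, using the text's rounded figure of 45 apps per reviewer:
print(round(apps_per_month / 45))              # ~1,333 reviewers
```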

If the application stores were provided gratis, simply to enable easy transactions between developer and consumer, such an overhead might seem onerous. However, both Apple and Google charge a sizeable percentage for transactions that take place within the stores – typically 30%. Particularly for Apple, given iOS users are far more inclined to make purchases than Android users[22], this can yield excellent revenue. Cult of Mac, for instance, reports that by 2016 the iOS App Store had grown to contribute 3.4% of Apple’s revenue, with H1 2017 App Store revenue of approximately $4.8 billion[23]. Assuming a 1.6x loading on a base salary of $60,000 per annum, even 1,500 staff would have a bottom-line impact of $144,000,000 per annum – easily sustainable given a half-year revenue of $4.8 billion. Even if the staffing cost for application curation alone were double this, it would still be a small fraction of that revenue.
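Again the figures check out. Note that the annual revenue below is a naive doubling of the reported H1 2017 figure – an assumption for illustration, not a reported number:

```python
# Staffing cost versus App Store revenue, per the figures in the text.
staff = 1_500
base_salary = 60_000
loading = 1.6                                  # on-costs multiplier from the text

annual_staff_cost = staff * base_salary * loading
print(f"${annual_staff_cost:,.0f}")            # $144,000,000

# Naively annualise the reported H1 2017 revenue of ~$4.8 billion.
annual_revenue = 2 * 4.8e9
print(f"{annual_staff_cost / annual_revenue:.1%} of revenue")   # 1.5% of revenue
```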

In what circumstances could it be considered that companies such as Apple and Google – both of whom have built pseudo-mandatory marketplaces for the applications that enable functionality or entertainment on their devices – have an ethical obligation to minimise fraudulent activity within their ecosystems? Simplistically, it would be tempting to suggest that the utilitarian model – the greatest happiness of the greatest number – neatly solves the exercise: since users would be happier knowing no fraudulent activity can take place within the stores, an ethical obligation would be established and the case could be closed. Yet once invoked, utilitarianism presents unfortunate side-effects – it might equally be argued that happiness would be increased further still if all content within the application stores were also free of charge, and so on.

To understand fully whether there may be ethical obligations for a strong focus on fraud minimisation in these ecosystems, one might instead compare them to other, more traditional ‘captive market’ scenarios. In an increasingly globalised marketplace, such markets are less common than they once were; a relevant example might be the sole shopping mall in a reasonably isolated town. Traders operating within the mall pay rent to its owner in order to sell products and/or services. The common presence of ‘cheap shops’ or ‘discount stores’ within shopping centres might suggest the mall is under no real obligation to vet the quality of goods or services its retail tenants provide. Equally, there is little evidence of shopping centres being sued for allowing retailers who sell fake goods – when trademark-violating goods do cause a lawsuit, it is almost invariably conducted directly between the trademark owner and the trademark violator.

However, law and ethics do not have a 1:1 relationship. We might better consider a hypothetical scenario in which a retail tenant in that sole shopping centre starts advertising graphic pornography within the mall – particularly where the content would be visible to minors who, given the lack of malls nearby, must of necessity visit the mall should they wish to shop. In such a case there is likely to be an increasing weight of ethical obligation on the owner of the mall to act – they are providing a space for the public to patronise, after all.

If the pornography example is a relevant comparison, it is likely relevant only to vendors such as Apple and Google disallowing pornographic applications within their marketplaces – and for the same reason[24]. Our ‘captive market’ example as it currently stands does not yet directly apply to fraudulent activity.

While cases of trademark infringement typically result in a direct lawsuit between holder and infringer in ‘bricks and mortar’ environments, this has not always been the case for electronic sellers. In 2008, for instance, a French company successfully sued eBay[25]:

Online auction giant eBay has been convicted of selling counterfeit goods and ordered to pay 20,000 euro ($32,497) in damages to French luxury group Hermes, Hermes’ lawyer said.

This is perhaps a more direct – at least legally direct – comparison, in that eBay takes a specific cut from each item sold, as opposed to traditional shopping centres, which usually charge retail tenants either a specific rental fee, or a combination of rental fee and general sales[26] percentage. Likewise, both Apple and Google charge a percentage of each transaction that runs through their application stores, meaning they have a direct interest in each individual transaction that takes place. Again though, even if a legal obligation is established, we have not necessarily established an ethical obligation for the curtailment of fraudulent activity within the application ecosystems provided by Google or Apple.

Therefore, in order to establish whether there is an ethical obligation for the curtailment of fraudulent activity, we must look elsewhere for inspiration. Here we might turn our attention to the notion of trust, and a comparison to a different form of market – the pawnbroker. First, in establishing their ‘walled garden’ application marketplaces, Apple and Google have at least implicitly created a statement of trust between themselves and the consumer: content is curated and verified, and therefore it should, to a degree, be deemed trustworthy.

Pawnbrokers and, to a broader degree, second-hand goods retailers have often been seen as a market for trading in stolen goods[27]:

South Australian Police are concerned about the role of second-hand dealers and pawnbrokers in the receipt and distribution of stolen property. The police believe that recidivist property offenders explore the opportunities available to commit crime. Often these offenders steal goods for resale.

The New South Wales guidelines for pawnbroking caution about the risk of the service being used to trade in stolen goods, and require particular records to be kept to allow the tracing of goods in the event of a crime being reported[28]. Finally, looking further abroad, the National Pawnbrokers Association of the USA provides an ethical code of conduct[29], requiring amongst other behaviours that members obey all applicable laws (which commonly include prohibitions on trading in stolen goods) and trade honestly with the public.

Herein we see examples of ethical (in addition to legal) requirements being placed on a seller to ensure the goods being sold are not fraudulent. While not a ‘captive market’ situation, pawnbrokers and second-hand goods sellers are somewhat representative of businesses that ‘take a cut’ of the transactions passing through them, albeit in a more delayed manner than smartphone application stores. Between eBay and pawnbrokers/second-hand stores we can see at least some leaning towards a general expectation that the goods being sold are not fraudulent – in the former case, dealing specifically with fakes; in the latter, stolen goods.

Thus we might suggest that, by comparison, Google and Apple may very well be ethically, as well as financially and legally, obligated to curtail fraudulent activities within their application stores. Such an obligation would impose requirements around: (a) preventing, wherever possible, such activity from taking place in the first place; (b) stopping activity once it is reported or otherwise detected; (c) refunding or compensating victims; and (d) providing consumers a simple mechanism to report fraudulent activity. Current processes are a mixed bag – it might well be argued, for instance, that given the relative lack of malware within the iOS environment, Apple exceeds Google’s efforts on (a), yet fails at (d), given the relative complexity involved in reporting inappropriate applications.

IDC’s ‘Three Platforms’ definition[30] (first: mainframe; second: client/server; third: mobile/social) gives some insight into just how important ethical obligations in these walled-garden ecosystems will become over time – each successive platform is typified by orders of magnitude more applications, increasing the number of potential vectors through which users might be adversely affected. Limiting user exposure to fraud will therefore require ongoing vigilance from users and independent security/data-protection groups, as well as from the likes of Apple and Google, and this is an area that will undoubtedly receive increasing attention as smartphones and other highly portable computing devices grow in use – particularly where such use supplants older-generation services.



  1. 2008
  2. App Store
  3. Install custom enterprise apps on iOS, Apple Support, May 2017
  4. Play Store
  5. How many apps are in the app store?, Sam Costello, May 5 2017, Lifewire
  6. Number of available applications in the Google play store from December 2009 to September 2017, Statista
  7. WhatsApp official web-site
  8. Fake WhatsApp app fooled million users on Google Play: Did you fall for it? Liam Tung, November 6 2017, ZDNet
  9. Ibid
  10. Beware, iPhone Users: Fake Retail Apps are Surging Before Holidays, Vindu Goel, November 6 2016, New York Times
  11. Apple, RIM Agree: No more fart apps, Adam Rosen, September 30 2010, Cult of Mac
  12. Apple is Taking Action Against Fake Ratings on the App Store, Sarah Perez, June 13 2014, Tech Crunch
  13. Microsoft is cracking down on fake Windows Store apps, Tom Warren, August 27 2014, The Verge
  14. Google Looking to Crack Down on Play Store Fraud, Tyler Lee, October 31 2016, Ubergizmo
  15. Massive Android Malware Outbreak Invades Google Play Store, Robert Hackett, September 14 2017, Fortune
  16. App Store Review Guidelines, Apple
  17. Developer Policy Center, Google
  18. Official CAPTCHA Site
  19. Birth, life and death of an app, July 2014, adjust
  20. App store to reach 5 million apps by 2020, with games leading the way, Sarah Perez, August 10 2016, TechCrunch
  21. Contacting Apple Support
  22. State of IAPs: iOS users spend 2.5x more on in-app purchases than Android users, Adam Sinicki, July 1 2016, Android Authority
  23. 2017 App Store Revenue crushes Apple’s entire 2007 earnings, Buster Hein, June 27 2017, Cult of Mac
  24. It could be argued this is similar to ethical obligations for newspaper and magazine stores to place adult magazines out of viewing height of minors – at least to a reasonable degree – and limit visibility of any material outside of a general classification through sealed covers, etc.
  25. eBay fined over counterfeit goods, AFP, June 5 2008, Sydney Morning Herald
  26. As opposed to specific, individual sales
  27. Second-hand dealers and pawnbrokers act 1996 and regulations competition and regulatory assessment, South Australian Government
  28. Pawnbroking and second-hand dealers, NSW Fair Trading
  29. National Pawnbrokers Association Code of Ethics
  30. IDC Third Platform