{"Algorithms & APIs"}

Exploring The Economics of Wholesale and Retail Algorithmic APIs

I got sucked into a month-long project applying machine learning filters to video over the holidays. The project began with me researching the economics behind Algorithmia's machine learning services, specifically the DeepFilter algorithm in their catalog. My algorithmic rotoscope work applying Algorithmia's deep filters to images and drone videos has given me a hands-on view of Algorithmia's approach to algorithms and APIs, and the opportunity to think pretty deeply about the economics of all of this. I think Algorithmia's vision has a lot of potential, not just for image filters, but for any sort of algorithmic and machine learning API.

Retail Algorithmic and Machine Learning APIs
Using Algorithmia is pretty straightforward. With their API or CLI you can make calls to a variety of algorithms in their catalog, in this case their DeepFilter solution. All I do is pass them the URL of an image, what I want the new filtered image to be called, and the name of the filter I want applied. Algorithmia provides an API explorer into which you can copy & paste the required JSON, and they also provide a demo application for you to use--no JSON required.
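
To make this concrete, here is a minimal sketch of that call using Algorithmia's Python client. The algorithm version, the input field names (images, savePaths, filterName), and the filter name reflect how I was using DeepFilter at the time, and may differ from the current docs.

```python
import Algorithmia

# Connect with your Algorithmia API key (placeholder below)
client = Algorithmia.client("YOUR_API_KEY")

# The DeepFilter algorithm in the Algorithmia catalog -- pin a version you have tested
algo = client.algo("deeplearning/DeepFilter/0.6.0")

# Input: source image URL, where to save the result, and which filter to apply
# (field names reflect the DeepFilter docs at the time of writing)
payload = {
    "images": ["http://example.com/photos/drone-still.jpg"],
    "savePaths": ["data://.algo/temp/drone-still-filtered.jpg"],
    "filterName": "gan_vogh"
}

response = algo.pipe(payload)
print(response.result)  # paths / URLs of the filtered image(s)
```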

Training Your Own Style Transfer Models Using Their AWS AMI
The first "rabbit hole" concept I fell into when doing the research on Algorithmia's model was their story on creating your own style transfer models, providing you step by step details on how to train them, including a ready to go AWS AMI that you can run as a GPU instance. At first, I thought they were just cannibalizing their own service, but then I realized it was much more savvier than that. They were offloading much of the costly compute resources needed to create the models, but the end product still resulted in using their Deep Filter APIs. 

Developing My Own API Layer For Working With Images and Videos
Once I had experience using Algorithmia's DeepFilter via their API, and had produced a handful of my own style transfer models, I got to work designing my own process for uploading and applying the filters to images, then eventually separating videos into individual images, applying the filters, and reassembling them into videos. The entire process, start to finish, is a set of APIs, with a couple of them simply acting as a facade for Algorithmia's file upload, download, and DeepFilter APIs. It provided me with a perfect hypothetical business for thinking through the economics of building on top of Algorithmia's platform.
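
A stripped-down sketch of that pipeline is below. ffmpeg is simply my choice of tool for exploding and reassembling frames, and apply_filter() is a stand-in for my facade calls to Algorithmia's file upload, download, and DeepFilter APIs.

```python
import glob
import os
import subprocess

def split_video(video_path, frames_dir):
    """Explode the video into numbered still frames using ffmpeg."""
    os.makedirs(frames_dir, exist_ok=True)
    subprocess.run(
        ["ffmpeg", "-i", video_path, f"{frames_dir}/frame_%05d.jpg"],
        check=True,
    )

def apply_filter(frame_path, filter_name):
    """Stand-in for my facade API, which wraps Algorithmia's file upload,
    download, and DeepFilter endpoints for a single frame."""
    ...  # upload the frame, call DeepFilter, download the filtered frame in place

def reassemble_video(frames_dir, output_path, fps=60):
    """Stitch the filtered frames back into a video at the original frame rate."""
    subprocess.run(
        ["ffmpeg", "-framerate", str(fps),
         "-i", f"{frames_dir}/frame_%05d.jpg", output_path],
        check=True,
    )

split_video("drone-footage.mp4", "frames")
for frame in sorted(glob.glob("frames/frame_*.jpg")):
    apply_filter(frame, "gan_vogh")
reassemble_video("frames", "drone-footage-filtered.mp4")
```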

Defining My Hard Costs of Algorithmia's Service and the AWS Compute Needed
Algorithmia provides a pricing calculator along with each of their algorithms, allowing you to easily predict your costs. They charge per API call, plus compute usage by the second. Each API has its own calculator and average runtime duration, so I'm easily able to calculate a per-image cost for applying filters--something that multiplies quickly when you are applying it to 60 frames (images) per second of video. Similarly, when it comes to training filter models using an AWS EC2 GPU instance, I have a per-hour charge for compute, storage costs, and (now) a pretty good idea of how many hours it takes to make a single filter.
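
Here is a rough back-of-the-envelope cost model. Every rate and duration below is an illustrative assumption, so plug in the numbers from Algorithmia's calculator and your own AWS bill.

```python
# Per-image cost on Algorithmia: a per-call royalty plus metered compute
COST_PER_API_CALL = 0.0001        # assumed per-call royalty (USD)
COST_PER_COMPUTE_SECOND = 0.0001  # assumed metered compute rate (USD/second)
AVG_RUNTIME_SECONDS = 5.0         # assumed average DeepFilter runtime per image

cost_per_image = COST_PER_API_CALL + COST_PER_COMPUTE_SECOND * AVG_RUNTIME_SECONDS

# A video multiplies that by frames per second times length
FPS = 60
video_seconds = 30
cost_per_video = cost_per_image * FPS * video_seconds

# Training a single style transfer model on an EC2 GPU instance
GPU_HOURLY_RATE = 0.65   # assumed GPU instance rate (USD/hour)
HOURS_PER_MODEL = 12     # assumed training time for one filter model

cost_per_model = GPU_HOURLY_RATE * HOURS_PER_MODEL

print(f"per image:  ${cost_per_image:.4f}")
print(f"per video:  ${cost_per_video:.2f}")
print(f"per filter: ${cost_per_model:.2f}")
```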

All of this gives me some pretty solid numbers to work with when trying to build a viable business on top of Algorithmia. In theory, when my customers use my algorithmic rotoscope image or video interface, as well as the API, I can cover my operating costs and generate a healthy profit by charging a per-image price for applying a machine learning texture filter. What I really think is innovative about Algorithmia's approach is that they are providing an AWS AMI to offload much of the "heavy compute lifting", with all roads still leading back to using their service. It is a model that could quickly shift algorithmic API consumers from being just retail-level API consumers to being wholesale / volume consumers.

My example of this focuses on images and video, but this model can be applied to any type of algorithmically fueled API. It provides me with a model for how you can safely open source the process behind your algorithms as an AWS AMI and actually drive more business to your APIs by evolving your API consumers into wholesale API consumers. In my experience, many API providers are very concerned with malicious users reverse engineering their algorithms via their APIs, when in reality, in true API fashion, there are ways you can actually open up your algorithms, making them more accessible and deployable, while still contributing significantly to your bottom line.

See The Full Blog Post


Pushing For More Algorithmic Transparency Using APIs

I saw the potential for collaboration when it came to using web APIs back around 2004 and 2005. I was seeing innovative companies opening up their digital assets to the world using low-cost, efficient Internet technologies like HTTP, inviting potentially interesting approaches to collaboration around the development of web and mobile applications on top of valuable digital resources. This approach has brought us valuable platforms like Amazon Web Services and Salesforce.

Common API discussions tend to focus on providing APIs to an ecosystem of developers and encouraging the development of web and mobile applications, widgets, visualizations, and other integrations that benefit the platform. In the course of these operations, it is also customary to gather feedback from the community and work to evolve the API's design, available resources, and even the underlying data model--extending collaboration to the APIs and underlying resources themselves, in addition to just building things on top of the API.

This approach to designing, defining, and deploying APIs, and then also web and mobile applications on top of these APIs, is nothing new, and is something I have been tracking for over six years. The transparency that APIs can inject into the evolution of data, content, and potentially the algorithms behind them is significant, which is how it became such a big part of my professional mission, fueling my drive to spread the "gospel" whenever and wherever I can.

Ok, so how can APIs contribute to algorithmic transparency? To fully grasp where I am taking this, you need to understand that APIs can be used as an input and output for data and content, as well as for algorithms. Let's use Twitter as an example. Using Twitter and the Twitter API I can read and write data about myself, or any user, using the /account and /users API endpoints--providing the content and data portion of what I am talking about.

When it comes to the algorithm portion, the Twitter API has several methods, such as GET statuses/user_timeline, GET statuses/home_timeline, and GET search/tweets, which return a "timeline of Tweet data". In 2006 this timeline was just the latest Tweets from the users you follow, in sequential order. In 2016, you will get "content powered by a variety of signals". In short, the algorithm that drives the Twitter timeline is pretty complicated, with a number of things to consider:

  • Your home timeline displays a stream of Tweets from accounts you have chosen to follow on Twitter. 
  • You may see suggested content powered by a variety of signals. 
  • Tweets you are likely to care about most will show up first in your timeline. 
  • You may see a summary of the most interesting Tweets you received since your last visit.
  • You may also see content such as promoted Tweets or Retweets in your timeline.
  • Additionally, when we identify a Tweet, an account to follow, or other content that's popular or relevant, we may add it to your timeline.

There are a number of considerations that go into any one timeline response--this is Twitter's algorithm. While I technically have access to this algorithm via three separate API endpoints, there really isn't much algorithmic transparency present beyond their overview in the support section. Most companies are going to claim this is their secret sauce and their intellectual property. That is fine, I don't have a problem with y'all being secretive about this, even though I will always push you to be more open, as well as to leave the API layer out of the patents you use to protect your algorithms.
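
For context, pulling the algorithmically curated timeline looks something like the sketch below, using the v1.1 REST API as it was documented at the time of writing. The OAuth credentials are placeholders, and what comes back is ordered and filtered by Twitter's algorithm rather than plain reverse chronology.

```python
import requests
from requests_oauthlib import OAuth1

# OAuth 1.0a credentials from your Twitter app (placeholders)
auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET")

# One of the three timeline endpoints -- the response reflects Twitter's
# ranking algorithm, with no visibility into the signals behind it
response = requests.get(
    "https://api.twitter.com/1.1/statuses/home_timeline.json",
    params={"count": 50},
    auth=auth,
)

for tweet in response.json():
    print(tweet["id_str"], tweet["user"]["screen_name"], tweet["text"][:60])
```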

Algorithmic transparency with APIs is not something that should be applied to all APIs, in my opinion, but for regulated industries, and truly open API solutions, transparency can go a long way and bring a number of benefits. All Twitter (and any other API provider) has to do is add parameters, and corresponding responses, that open up the variables of the underlying algorithm for each endpoint. What goes into deciding what I "care about", what constitutes "interesting", and what makes things "popular or relevant"? Twitter will never do this, but other API providers can.
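
A purely hypothetical sketch of what that could look like is below. The endpoint and the weighting parameters do not exist anywhere; they just illustrate how a provider could expose the variables of a ranking algorithm as plain request parameters.

```python
import requests

# Hypothetical "transparent timeline" endpoint -- not a real Twitter API.
# The weights expose the knobs of the ranking algorithm to the consumer.
response = requests.get(
    "https://api.example.com/1.1/statuses/home_timeline.json",
    params={
        "relevance_weight": 0.2,   # how much "Tweets you care about" counts
        "recency_weight": 0.7,     # how much plain reverse chronology counts
        "popularity_weight": 0.1,  # how much "popular or relevant" content counts
        "include_promoted": 0,     # opt out of promoted Tweets entirely
        "count": 50,
    },
    headers={"Authorization": "Bearer PLACEHOLDER_TOKEN"},
)

print(response.json())
```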

It is up to each API provider to decide how transparent they are going to be with their algorithms. The ideal solution when it comes to transparency is that the algorithm is documented and shared along with supporting code on Github, like Chicago did for their food inspection algorithm. This opens up the algorithm, and the code behind it, for evaluation by 3rd parties, who can potentially improve upon it, as well as validate its logic--opening up a conversation about the life of the algorithm.

There are a number of common reasons I have seen for companies and developers not opening up their algorithms:

  • It truly is secret sauce, and too much was invested to just share it with the world.
  • It is crap, and the creator doesn't want anyone to know there is nothing behind it.
  • There are malicious things going on behind the scenes that they do not want to be public.
  • Insecurities about coding abilities, security practices, and the logic applied to the algorithms.
  • They exist in a competitive space with lots of bad actors, and may want to limit bad behavior.
  • What is accomplished isn't really that defensible, and the only advantage is to keep it hidden.

I have no problem making an argument for algorithmic transparency when it comes to regulated industries like finance, healthcare, and education. I think it should be the default in all civic, non-profit, and other similar scenarios, where the whole stack should just be open sourced and available on Github. You won't find me pushing back too hard on the startups unless I see some wild claims about the magic behind their algorithms, or I see evidence of exploitation--then you will hear me rant about this some more.

Algorithmic transparency can help limit algorithmic exploitation and the other shady shit that is going on behind the scenes on a regular basis these days. I have added an algorithm section to my research, and as I see more talk about the magic of algorithms, and how these amazing creations are changing our world--I am going to be poking around a bit, and probably asking to see more algorithmic transparency when I think it makes sense.

See The Full Blog Post


The Opportunity For API-Driven Algorithmic Transparency At The Mobile Data Plan Level

API Evangelist is focused on helping push for sensible API-driven transparency wherever I can get it. When done right, an API can crack open the often black box that is the algorithm, giving us access to, and more control over, our online experience.

One of the most significant algorithmic bottlenecks governing our daily lives is the mobile data plan. All of our mobile phones are governed at the data plan level--this is where the telecom companies make their money, throttling the bits and bytes we depend on each day.

The mobile data plan is a great place to discuss the algorithmic and data transparency that APIs can assist with, and one example of this in action is the Google Mobile Data Plan API. Google wants more access at this level to improve the quality of experience for end-users using mobile applications like YouTube, which can be severely impacted by data plan limits, while also significantly impacting your data plan consumption if not optimized.
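
To illustrate the idea (and this is not Google's actual API), here is a hypothetical sketch of an application asking a carrier's data plan API where a user stands. The endpoint, fields, and token are all made up; the point is that a public API at this layer would let applications, and users, see the meter.

```python
import requests

# Hypothetical carrier data plan endpoint -- purely illustrative
response = requests.get(
    "https://api.example-carrier.com/v1/dataplans/me",
    headers={"Authorization": "Bearer PLACEHOLDER_TOKEN"},
)

plan = response.json()
remaining_mb = plan["remaining_bytes"] / (1024 * 1024)
print(f"{remaining_mb:.0f} MB left before I get throttled or charged")
```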

There is so much opportunity for discussion between mobile network operators, API providers, developers, and consumers at the data plan level. I know mobile network operators would rather keep this a black box so they can maximize their revenue, but when you crack the network layer open with a publicly available API, there will be a number of new revenue opportunities.

Data plans affect all of us, every single day. We need more transparency into the algorithms that meter, limit, and charge us at the mobile data plan layer. We need the platforms we depend on each day to have more tools to optimize how applications consume (or do not consume) this valuable and (seemingly) finite resource (thanks, telcos!).

I've added an algorithms area to my research to keep an eye on this topic, curate stories I find, and share my own thoughts when it comes to algorithmic transparency using APIs.

See The Full Blog Post


Keeping A Window Open Into How Power Flows Within Algorithms Using APIs

I just read The Pill versus the Bomb: What Digital Technologists Need to Know About Power, by Tom Steinberg (@steiny), and I'm reminded of the important role APIs will (hopefully) continue to play in helping provide a transparent window into some of the power structures being coded into the algorithms we are increasingly relying on in this digital world we are crafting.

In this century, we are seeing a huge shift in how power flows, and despite the rhetoric of some of the Silicon Valley believers, this power isn't always being democratized along the way. Many of the older power structures are just being re-inscribed into the algorithms that drive network switches, decide pricing when we purchase online, run our online banking, and touch virtually every other aspect of our personal and business worlds.

APIs give us a window into how these algorithms work, providing access to 3rd party developers, government regulators, journalists, and many other essential actors across our society and economy. Don't get me wrong, APIs are no magic pill, or nuclear bomb, when it comes to making algorithmic power flows more transparent and equitable, but when they are done right, they can have a significant effect.

If APIs are a complete (or near complete) representation of the algorithms that are driving platforms, they can be used to better understand how decisions behind the algorithmic curtain are made, and exactly how power is flowing (or not) on web, mobile, and increasingly connected device platforms. APIs do not equal perfect transparency, but they will help keep algorithms from all being black boxes.

We may not fully understand Uber's business motivations, but through their API we can test our assumptions. We may not always trust Facebook's advertising algorithm, but using the API we can develop models for better understanding why they serve the ads they do. Drone operators may not always have the best intentions, but through mandatory device APIs, we can log flight times and locations. These are just a handful of examples of how APIs can be used to map out digital power.
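
As a small sketch of what "testing our assumptions" can look like, the snippet below polls Uber's public price estimates endpoint over the course of a day to watch surge multipliers. The path, version, and response fields reflect the API as documented at the time of writing, and may have changed since; the server token and coordinates are placeholders.

```python
import time
import requests

UBER_SERVER_TOKEN = "PLACEHOLDER"
params = {
    "start_latitude": 40.7484, "start_longitude": -73.9857,  # midtown Manhattan
    "end_latitude": 40.6413, "end_longitude": -73.7781,      # JFK airport
}

# Sample every ten minutes for a day to map surge pricing over time
for _ in range(6 * 24):
    response = requests.get(
        "https://api.uber.com/v1/estimates/price",
        params=params,
        headers={"Authorization": f"Token {UBER_SERVER_TOKEN}"},
    )
    for estimate in response.json().get("prices", []):
        print(time.ctime(), estimate["display_name"], estimate.get("surge_multiplier", 1))
    time.sleep(600)
```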

All of this is one of the main reasons that I do API Evangelist. I feel like we have a narrow window of opportunity to help ensure APIs act as this essential transparent layer for ALL API operations across industries. As the established power structures (the eye of Sauron) turn their attention to the web, and increasingly to APIs, that window of transparency grows smaller. It is up to us API evangelists to help make sure APIs stay publicly available to 3rd party developers, government, journalists, end users, and other key players--providing much needed transparency into how algorithms work, and how power is flowing on the web and mobile Internet.

See The Full Blog Post