Thursday, 8 December 2011

Writing an #API for #AdTotum from scratch

I've been thinking over the last couple of months about the value of protecting high-performance, high-scale production systems from bugs that can be introduced when new features, products and models are added to the system.

I've been advising on adding additional inference models to the AdTotum T745 system using the RAP RDF API, along with some Markov chain models and extra geo-weather and search trend data sources. The question was: we get so many new data sources at AdTotum and need to test and validate them in real time, so how can we do that stably while retaining scalability?

The answer, it seems to me, is to build a robust internal API - a challenge that takes real discipline and a deep understanding of both the code and the likely product roadmap if it's to be done without impacting performance, security or scalability. It's a commitment that other companies I've been involved with have shied away from, but after careful reflection on the top-down needs I've been inspired to sit down and write a complete #API for #AdTotum from scratch. Over the past 24 hours I've been designing the architecture and running some code scalability tests to make sure that it really makes sense.
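To make the idea concrete, here is a minimal Python sketch of the kind of internal API layer I have in mind: a registry that new data sources plug into, and a single GetData-style call that the models use, with optional tokenization. All class, method and parameter names here (DataSource, InternalAPI, get_data, and so on) are illustrative assumptions for the sketch, not the actual AdTotum interface.

```python
# Illustrative sketch only - names and signatures are assumptions,
# not the real AdTotum internal API.
from abc import ABC, abstractmethod
from typing import Any, Dict


class DataSource(ABC):
    """Contract every new data source must satisfy before it reaches the
    production models; sources can then be added or swapped without
    touching model code."""

    @abstractmethod
    def fetch(self, user_id: str) -> Dict[str, Any]:
        ...


class GeoWeatherSource(DataSource):
    """Hypothetical geo-weather source; a real one would query an
    external weather service keyed on the user's location."""

    def fetch(self, user_id: str) -> Dict[str, Any]:
        return {"temperature_c": 4.0, "conditions": "overcast"}


class InternalAPI:
    """Single entry point the inference models call; it validates and
    optionally tokenizes data, keeping experimental sources isolated
    from the production path."""

    def __init__(self) -> None:
        self._sources: Dict[str, DataSource] = {}

    def register_source(self, name: str, source: DataSource) -> None:
        self._sources[name] = source

    def get_data(self, source: str, user_id: str,
                 tokenized: bool = False) -> Dict[str, Any]:
        data = self._sources[source].fetch(user_id)
        if tokenized:
            # Replace raw values with opaque tokens so models can use the
            # signal without seeing user-identifiable detail.
            return {k: f"tok_{abs(hash((k, v))) % 10_000}"
                    for k, v in data.items()}
        return data


# Example usage: register the geo-weather source and pull tokenized data.
api = InternalAPI()
api.register_source("geo_weather", GeoWeatherSource())
print(api.get_data("geo_weather", user_id="u123", tokenized=True))
```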

http://www.linkedin.com/company/adtotum

2 comments:

  1. Beautiful - the geo-weather API works perfectly already through the GetData API call!

  2. Just to clarify - that means that all the modeling systems now have access to the current weather and temperature, in full or tokenized form, for any individual user (or indeed viewer) right now - in real time.
