Top 10 AI New Year’s Resolutions for 2018
Looking forward to keeping all of them!
Number 10: Any Cloud, Any Platform: AI has progressed in leaps and bounds, but much of that fast progress has been focused on enabling a specific capability rather than supporting the general case. Generalizing AI to enable more and more capabilities requires abstracting away infrastructure specifics and cloud-specific requirements.
Number 9: A Closer Working Relationship between Big Data and AI: For more than a few years, many people have pointed to Big Data as a solution looking for a problem. Yet pretty much all AI services need a lot of data, at the very least in the training phase. Tighter interoperability between the Hadoop ecosystem and the TensorFlow/Keras ecosystem will make a strong marriage, or at least a good working relationship.
Number 8: Better Feature Extraction with AI during ETL for AI. It is hard enough to perform data munging as preparation for input into a Deep Neural Net. Extracting features from any data type still requires too much tailoring. Everyone says this phase takes way too long; let's improve it in 2018.
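To make the tailoring problem concrete, here is a minimal sketch of one low-effort feature extraction technique, the "hashing trick", which turns arbitrary tokenized records into fixed-width numeric vectors without a fitted vocabulary. The dimension and whitespace tokenizer are illustrative assumptions, not a prescribed pipeline.

```python
import hashlib

def hash_features(tokens, dim=16):
    """Map a list of string tokens to a fixed-length count vector."""
    vec = [0.0] * dim
    for tok in tokens:
        # Stable hash -> bucket index, so no vocabulary needs to be fit first.
        idx = int(hashlib.md5(tok.encode("utf-8")).hexdigest(), 16) % dim
        vec[idx] += 1.0
    return vec

row = "user clicked ad on mobile".split()
features = hash_features(row)
print(len(features), sum(features))  # 16-dimensional vector, 5 total counts
```

The appeal for ETL is that the same function works on any tokenizable input, which is exactly the kind of generic, per-data-type step the resolution asks for.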
Number 7: Systematic access to training data. AI is clearly no longer a skunkworks project within businesses. “Hacking” at data does not scale securely. Systematic, role-based access to secure data in a common NoSQL repository for training is a requirement for business AI in 2018.
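The role-based access idea can be sketched in a few lines. The role names and dataset names below are made up for illustration; a real deployment would delegate this check to the data platform's own authorization layer rather than an in-process dictionary.

```python
# Hypothetical role -> dataset grants; in production this would live in
# the data platform's authorization service, not application code.
ROLE_GRANTS = {
    "data-scientist": {"clickstream", "transactions"},
    "analyst": {"clickstream"},
}

def can_read(role, dataset):
    """Return True if the role is granted read access to the dataset."""
    return dataset in ROLE_GRANTS.get(role, set())

print(can_read("analyst", "transactions"))        # False: no grant for analysts
print(can_read("data-scientist", "transactions")) # True
```

The point is that every training job asks the same question through the same gate, which is what makes access auditable instead of ad hoc.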
Number 6: Better Environments to run Machine Learning Libraries. TensorFlow, Keras, PyTorch, Theano, Caffe, and others have greatly facilitated the AI Innovator. But getting these libraries to work in a business environment still presents significant IT challenges.
Number 5: Orchestration and Compliance. Yes, bringing this up makes me sound like a party-pooper. But let's get it off the table with AI platforms that check all the compliance boxes.
Number 4: APIs, APIs, APIs. We all know AI is big, but the only way AI usage scales is through better, faster, simpler, more common APIs for “tbd” usage of AI results.
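What "a common API for AI results" can mean in practice is a stable request/response contract. Here is a minimal sketch of such a contract over JSON; the hard-coded linear scorer is a stand-in for real inference, and the field names are assumptions for illustration.

```python
import json

def predict_handler(request_body: str) -> str:
    """Accept a JSON request {"features": [...]} and return a JSON score."""
    payload = json.loads(request_body)
    features = payload["features"]
    # Placeholder model: a fixed weighted sum standing in for real inference.
    weights = [0.5, -0.25, 1.0]
    score = sum(w * x for w, x in zip(weights, features))
    return json.dumps({"score": score})

response = predict_handler('{"features": [2.0, 4.0, 1.0]}')
print(response)  # {"score": 1.0}
```

Because the contract is just JSON in, JSON out, the same handler can sit behind any HTTP framework or cloud endpoint, which is what lets consumers stay ignorant of the model behind it.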
Number 3: Auto-tuning of DNNs. I like graduate students as much as anyone, but hyperparameter tuning has been a graduate-student task for too long. The fall of 2017 brought more papers on this topic; let's make it a reality for AI Innovators in 2018.
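One of the simplest baselines in the auto-tuning literature is random search over hyperparameters. Here is a minimal sketch, assuming a toy objective standing in for a real train-and-validate loop (which would dominate the cost in practice):

```python
import random

def validation_loss(lr, width):
    """Toy stand-in for training a DNN and measuring validation loss."""
    return (lr - 0.01) ** 2 + (width - 64) ** 2 / 10000.0

def random_search(trials=50, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        lr = 10 ** rng.uniform(-4, 0)   # sample learning rate log-uniformly
        width = rng.randrange(8, 257)   # sample layer width uniformly
        loss = validation_loss(lr, width)
        if best is None or loss < best[0]:
            best = (loss, lr, width)
    return best

loss, lr, width = random_search()
print(round(loss, 4), width)
```

Smarter methods (Bayesian optimization, population-based training) refine this loop, but the structure is the same: the machine, not the graduate student, proposes the next configuration.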
Number 2: Improved horizontal scaling of DNNs. Rightly, there is a lot of talk about GPUs, TPUs, and when CPUs are OK. But it all points to the fact that better results are being generated from bigger DNNs, which require new ways of thinking about how to quickly generate the quality of results needed for specific applications and implementations of AI.
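The core idea behind most horizontal-scaling schemes is data parallelism: each worker computes gradients on its shard of the data, the gradients are averaged, and the shared weights are updated. A minimal single-process sketch of that loop, with loop iterations standing in for real distributed workers and an explicit average standing in for an all-reduce:

```python
def worker_gradient(weights, shard):
    """Gradient of mean squared error for y = w*x on one data shard."""
    w = weights[0]
    n = len(shard)
    return [sum(2 * (w * x - y) * x for x, y in shard) / n]

def train_step(weights, shards, lr=0.1):
    grads = [worker_gradient(weights, s) for s in shards]
    # Average gradients across workers, mimicking an all-reduce step.
    avg = [sum(g[i] for g in grads) / len(grads) for i in range(len(weights))]
    return [w - lr * g for w, g in zip(weights, avg)]

shards = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0), (4.0, 8.0)]]  # data from y = 2x
w = [0.0]
for _ in range(200):
    w = train_step(w, shards)
print(round(w[0], 3))  # converges toward 2.0
```

The open questions the resolution points at live exactly in that averaging step: how often to synchronize, over what network topology, and with what precision, once the model no longer fits comfortably on one device.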
Number 1: Better ways to serve AI results as applications. 2018 should bring more ways for AI Innovators to market their results beyond showcasing their skillsets to the big AI employers.
Happy New Year!