One of our Melbourne Data Scientists, Jeremiah Mannings, shares his thoughts on Amazon’s Machine Learning platform, SageMaker.
AWS has finally released SageMaker (https://aws.amazon.com/sagemaker/), catching up in the market with offerings such as Azure ML Studio. In typical Amazon style, they have taken all the good aspects of other automated machine learning cloud environments, improved them a little, and made the result cheaper and easier to integrate.
The market for automated ML solutions is really starting to ramp up. SageMaker gives quick access to all the AWS database systems and data stores, meaning it's even easier to bring organisational data into a machine learning model and work with it in a familiar interface (Jupyter notebooks). It also enables models to be tuned and productionised outside the development notebook, which solves one of the bigger pain points in the ML industry: integrating models quickly and effectively.
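As a minimal sketch of what that data access looks like, the snippet below pulls a CSV object from S3 into a pandas DataFrame inside a notebook. The bucket and key are hypothetical, and the actual download requires AWS credentials, so the AWS call is kept inside a function rather than run on import.

```python
import io


def parse_s3_uri(uri):
    """Split an s3://bucket/key URI into (bucket, key)."""
    if not uri.startswith("s3://"):
        raise ValueError("expected an s3:// URI")
    bucket, _, key = uri[len("s3://"):].partition("/")
    return bucket, key


def load_csv_from_s3(uri):
    """Read a CSV object from S3 straight into a pandas DataFrame."""
    import boto3         # deferred: actually calling S3 needs AWS credentials
    import pandas as pd  # deferred so the URI helper above is dependency-free

    bucket, key = parse_s3_uri(uri)
    obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
    return pd.read_csv(io.BytesIO(obj["Body"].read()))


# Hypothetical usage from a notebook cell:
# df = load_csv_from_s3("s3://my-org-bucket/exports/train.csv")
```

Because the notebook instance runs inside your AWS account, the same credentials and VPC access it already has cover calls like this with no extra setup.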
It’s not quite ready for worldwide usage (it’s mainly available in the US at the moment), but it is still a fantastic service to get used to, and a good place to start moving and experimenting with previous projects in a collaborative cloud. It makes it easy to switch between projects, as you can host as many notebooks as you want and scale them up and down as each project requires.
The overview of the SageMaker Service on AWS.
The notebook interface is well polished for an initial version, and easy to adapt to if you are used to working in local notebooks. Typically, a lot of development for these kinds of projects happens on hosted machines running local notebooks, which makes it hard to stop and start them dynamically; SageMaker circumvents that by letting you install packages and use open-source libraries in a hosted Jupyter notebook with built-in integrations. It is essentially the same tooling that is already available, packaged in a much more usable, collaborative, and adaptive way, so spinning up an ML project takes half the time.
Much like Azure ML’s published API endpoints, SageMaker lets you create a hosted API endpoint backed by an inference Docker image, so any model built on the service can be hosted and accessed. This enables very efficient rapid prototyping for apps, something that was previously difficult with other services and local notebooks.
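A rough sketch of what calling such a hosted endpoint looks like from application code is below, assuming a model has already been deployed under the hypothetical name "demo-endpoint". The `invoke_endpoint` call needs AWS credentials, so only the payload helper runs locally here.

```python
def serialize_features(rows):
    """Serialise feature rows into the CSV body many SageMaker models expect."""
    return "\n".join(",".join(str(v) for v in row) for row in rows)


def predict(rows, endpoint_name="demo-endpoint"):
    """Send feature rows to a hosted endpoint and return the raw response body."""
    import boto3  # deferred: requires the AWS SDK and valid credentials

    runtime = boto3.client("sagemaker-runtime")
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,   # hypothetical endpoint name
        ContentType="text/csv",
        Body=serialize_features(rows),
    )
    return response["Body"].read()


# Hypothetical usage from an app backend:
# print(predict([[5.1, 3.5, 1.4, 0.2]]))
```

The endpoint itself is just an HTTPS service in front of your inference container, which is what makes wiring a prototype app to a freshly trained model so quick.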
You should definitely check SageMaker out if you are into machine learning development; it provides a great platform with a lot of useful integrated tools. There is currently a 250-hour free tier on the lower end of the product, which provides plenty of time to explore it and discover how it works.
Introducing the new service on stage at the re:Invent conference, AWS CEO Andy Jassy described it as “Amazon SageMaker, an easy way to train and deploy machine learning models for everyday developers.”
Quick Use Case Ideas
Real-time IoT predictions: Using data coming in from IoT databases on AWS, models can be constructed quickly from these large development datasets and tested against real, valuable data as it arrives. For example, IoT weather prediction based on location and sensor readings could be done by constructing a number of models within SageMaker and making near real-time predictions on incoming data.
Financial prediction: Data coming in from markets such as cryptocurrencies can be stored, and machine learning and statistical models built and iterated on quickly within the SageMaker notebooks. The direct, fast database connections are a particular advantage here, supporting real-time predictions and very quick backtesting and updating of models.
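To illustrate the kind of quick backtest the financial use case describes, here is a toy example: predict each day’s price as the rolling mean of the previous three prices and score the forecast with mean absolute error. The price series is made up for the example; in practice it would come from one of the connected data stores.

```python
import pandas as pd

# Made-up daily price series standing in for market data pulled from a store.
prices = pd.Series([100.0, 101.0, 103.0, 102.0, 105.0, 107.0, 106.0, 108.0])
window = 3

# Forecast for day t = mean of the `window` prices strictly before day t.
forecast = prices.rolling(window).mean().shift(1)

# Score only the days where a full window of history was available.
errors = (prices - forecast).abs().dropna()
print(f"mean absolute error: {errors.mean():.3f}")  # → mean absolute error: 2.133
```

Swapping in a different model, window, or error metric and re-running the cell is exactly the fast iterate-and-backtest loop the notebooks make convenient.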
So far I’m pretty much in love with the capabilities of this service and how much efficiency it adds to my work. Hopefully, as adoption grows, AWS will keep adding exciting features that bring even more value into the space. SageMaker will definitely be one of my go-to tools for quick, efficient application and model development.