One of the great improvements in Magento 2 is its new API, which is easy for users to consume and easy for Magento developers to extend. The automatically generated Swagger documentation makes the REST interface even more practical.
The Magento 2 API covers all the main import/export flows of a web shop: creation and update of simple and complex products, categories, and customers. The full checkout process can also be controlled via the API; in fact, Magento 2’s standard checkout uses the very same checkout API in the background.
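To give a feel for how straightforward the standard REST API is, here is a minimal sketch of creating a simple product with Python’s requests library. The shop URL, credentials, and SKU are placeholders, not a real installation.

```python
import requests

BASE = "https://example-shop.test/rest"  # placeholder shop URL

# Obtain an admin token (integration tokens work the same way).
token = requests.post(
    f"{BASE}/V1/integration/admin/token",
    json={"username": "api-user", "password": "api-password"},
).json()

headers = {"Authorization": f"Bearer {token}"}

# Create a simple product; updates go to PUT /V1/products/<sku> with the same payload shape.
product = {
    "product": {
        "sku": "DEMO-001",
        "name": "Demo Product",
        "attribute_set_id": 4,   # default attribute set
        "price": 19.99,
        "type_id": "simple",
        "status": 1,
        "visibility": 4,
    }
}
response = requests.post(f"{BASE}/V1/products", json=product, headers=headers)
print(response.status_code, response.json().get("id"))
```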
Even though the Magento 2 API is a huge step forward, as of Q3 2018 one painful issue remains: it suffers from serious performance and scalability shortcomings, mainly because it lacks an optimized bulk API for catalog import. Bulk import becomes crucial when web shops are connected to external ERPs or warehouse management systems, and the problem becomes even more severe for web shops with large numbers of products and frequent product data changes that have to be pushed to Magento.
Creating a bulk API that is both efficient and scalable and fully substitutes the standard API is a big challenge: catalog updates are not simple database inserts but trigger a number of calculations, database operations, and cache invalidations that are essential for presenting the products on the frontend. Magento’s unique features, such as tier and group pricing, multiple websites and stores, multiple currencies, and catalog and cart price rules, make these operations even more complicated. In addition, Magento has an advanced observer and plugin system that lets extension developers ‘piggyback’ on system events and procedures while adding their own custom logic.
To solve all these issues and raise Magento to an enterprise level, a community-driven project was started on GitHub. The goals and design of the initiative are absolutely brilliant:
- Have an API that accepts multiple products to be persisted at a time (see the request sketch after this list)
- The internal implementation of the Bulk API should guarantee that products coming into the Bulk API are persisted without deadlocks
- An implementation may optimize Bulk API persistence for performance, but it should remain backward compatible with existing customizations around persistence operations
- It must be possible for the client to send multiple Bulk API requests in parallel
- It should be possible for Magento to process multiple Bulk API workers in parallel
- Allow for support of other entities in the future
- HTTP requests must complete quickly; processing of the bulk data is performed asynchronously
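To make the intent concrete, here is a sketch of what such a call could look like from a client’s perspective. The /async/bulk/V1/products route and the bulk_uuid-style response are assumptions about the proposed interface, not a finished, documented API.

```python
import requests

BASE = "https://example-shop.test/rest"              # placeholder shop URL
headers = {"Authorization": "Bearer <admin-token>"}  # token obtained as in the earlier sketch

# Assumed shape of the proposed bulk endpoint: one request carries many product
# payloads; the response returns immediately with a bulk UUID while workers
# persist the items asynchronously.
payload = [
    {"product": {"sku": f"BULK-{i:04d}",
                 "name": f"Bulk product {i}",
                 "attribute_set_id": 4,
                 "price": 9.99,
                 "type_id": "simple"}}
    for i in range(1000)
]

resp = requests.post(f"{BASE}/async/bulk/V1/products", json=payload, headers=headers)
print(resp.json().get("bulk_uuid"))  # handle for checking processing status later
```

Because the request only enqueues the items, the HTTP call returns quickly, and the client can check the outcome of the asynchronous processing later using the returned identifier.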
On top of all this, the project aims to extend the bulk API with:
- Incremental import that detects changes and saves only the data that actually changed (a conceptual sketch follows this list)
- In parallel, the ability to process partial data pushed to the API
- Processing bulk data pushes asynchronously via queues
- Plugging these optimizations into any import/export process, including CSV imports
- Support for model events and plugins
- Support for multiple persistence layers in addition to MySQL
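As a purely conceptual sketch of the incremental-import idea, the snippet below fingerprints each product payload and forwards only the ones that differ from the previously pushed version. All names (payload_hash, changed_products, last_seen) are illustrative and not part of any Magento API; a real connector would keep the fingerprints in its own storage.

```python
import hashlib
import json

def payload_hash(product: dict) -> str:
    """Stable fingerprint of a product payload (illustrative only)."""
    return hashlib.sha256(
        json.dumps(product, sort_keys=True).encode("utf-8")
    ).hexdigest()

def changed_products(incoming: list[dict], last_seen: dict[str, str]) -> list[dict]:
    """Keep only products whose payload differs from the previously pushed version."""
    to_push = []
    for product in incoming:
        sku = product["sku"]
        digest = payload_hash(product)
        if last_seen.get(sku) != digest:
            to_push.append(product)
            last_seen[sku] = digest
    return to_push

# Example: only the second product actually changed since the last sync.
last_seen = {"A": payload_hash({"sku": "A", "price": 10.0}),
             "B": payload_hash({"sku": "B", "price": 12.0})}
batch = [{"sku": "A", "price": 10.0}, {"sku": "B", "price": 11.5}]
print(changed_products(batch, last_seen))  # -> [{'sku': 'B', 'price': 11.5}]
```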
With more than ten years of Magento community experience and hundreds of ERP integrations behind us, these goals seem both valid and achievable. The project is progressing impressively, with a great deal of activity in the community. ITG looks forward to its presentation at MagentoLive Europe in Barcelona and will keep you updated.
As Chief Technology Officer, Ferenc leads the technology department at ITG Commerce. He is a recognized Magento Master and a certified Shopware Developer.