We have more than ten years of experience in the big data industry and have helped many enterprises build their first big data platform and workflows. Our customers are happy with our professional services.
We work with a range of modern big data technologies, including Spark, Hadoop, Flink, Storm, Kafka, and NoSQL databases.
We focus on building a framework that is easy to use in everyday development. Developers do not need to be familiar with big data technologies; they only need to focus on their business logic. The framework includes built-in optimization of big data jobs: it shrinks the execution plan and reduces data shuffling by up to 70%.
Developers do not even need to write any Spark code; the transformation logic is all they have to provide. Once the business logic is done, an efficient Spark job can be deployed to their cluster and start producing the desired results.
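To illustrate the idea, here is a minimal sketch of what such a workflow could look like. The names (`Pipeline`, `transform`, `run`) are hypothetical, not the framework's actual API: the developer registers plain transformation functions, and the framework fuses them into a single pass over the data, analogous to shrinking the execution plan to avoid intermediate results.

```python
from typing import Callable, Iterable, List

class Pipeline:
    """Hypothetical sketch: developers register plain transformation
    functions; the framework builds and optimizes the execution plan."""

    def __init__(self) -> None:
        self._stages: List[Callable] = []

    def transform(self, fn: Callable) -> "Pipeline":
        # Developers supply only business logic, no Spark code.
        self._stages.append(fn)
        return self

    def run(self, records: Iterable) -> List:
        # Fuse all registered transformations into one pass over the
        # data, so no intermediate collections are materialized.
        out = []
        for rec in records:
            for fn in self._stages:
                rec = fn(rec)
            out.append(rec)
        return out

# Usage: pure business logic, chained on the pipeline.
pipeline = (
    Pipeline()
    .transform(lambda r: {**r, "total": r["price"] * r["qty"]})
    .transform(lambda r: {**r, "total_with_tax": r["total"] * 1.1})
)
orders = [{"price": 10.0, "qty": 3}, {"price": 5.0, "qty": 2}]
results = pipeline.run(orders)
```

In a real deployment the same chain of transformations would be compiled into an optimized Spark job rather than run locally.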
With complex business logic, we can process 1 TB of data within 15 minutes.