Data Lake Analytics simplifies the otherwise complex task of managing distributed infrastructure and code. It dynamically provisions resources and lets you run analytics on exabytes of data. When a job completes, it winds down resources automatically, and you pay only for the processing power used. As the size of your stored data or the amount of compute you use grows or shrinks, you don't have to rewrite code. Many of the default limits below can be raised for your subscription by contacting support.
Resource | Default Limit | Comments |
---|---|---|
Maximum number of concurrent jobs | 20 | |
Maximum number of Analytics Units (AUs) per account | 250 | Use any combination of up to a maximum of 250 AUs across 20 jobs. |
Maximum script size for job submission | 3 MB | |
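To make the interplay of these limits concrete, the sketch below checks a prospective job against the three defaults in the table. The function name and parameters are illustrative assumptions, not part of any Azure SDK; only the limit values come from the table above.

```python
# Illustrative pre-submission check against the default Data Lake Analytics
# limits listed above. These constants mirror the table; the helper itself
# is a hypothetical sketch, not an Azure API.

MAX_CONCURRENT_JOBS = 20
MAX_ACCOUNT_AUS = 250                # Analytics Units per account
MAX_SCRIPT_BYTES = 3 * 1024 * 1024   # 3 MB script size

def can_submit(script: str, requested_aus: int,
               running_jobs: int, aus_in_use: int) -> bool:
    """Return True only if the job fits within all three default limits."""
    if len(script.encode("utf-8")) > MAX_SCRIPT_BYTES:
        return False
    if running_jobs + 1 > MAX_CONCURRENT_JOBS:
        return False
    if aus_in_use + requested_aus > MAX_ACCOUNT_AUS:
        return False
    return True

# With 19 jobs running and 240 AUs in use, a 20-AU job exceeds the
# 250-AU cap, but a 10-AU job still fits exactly.
print(can_submit("SELECT 1;", requested_aus=20, running_jobs=19, aus_in_use=240))  # False
print(can_submit("SELECT 1;", requested_aus=10, running_jobs=19, aus_in_use=240))  # True
```

Note that the AU cap applies across all concurrent jobs combined, so a single large job can block smaller ones even when the concurrent-job limit is not reached.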