10 Size Limits
BigML does not impose any limits on the number of sources you can upload to a single account or assign to a specific project. Each source can store an arbitrarily large number of instances and handle a relatively large number of fields. For example, the BigML multi-tenant version can process datasets with hundreds of millions of rows and tens of thousands of fields.
The BigML multi-tenant version does impose some limits on the total size of files, depending on the way you bring your data to BigML:
- Local sources:
files uploaded directly through the browser, via drag and drop, or through the API (see the sketch after this list) are limited to 64 GB in size.
- Remote sources:
files uploaded using any of the accepted protocols defined in section 8.1 are also limited to 64 GB; however, when using Amazon Simple Storage Service (S3), the limit is 5 TB.
- Inline sources:
sources created using the online editor are limited to 16 MB.
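For reference, the sketch below shows how a local file and a remote S3 file might be uploaded as sources programmatically using the BigML Python bindings. The file path, bucket name, and source names are placeholders, and credentials are assumed to be available in the BIGML_USERNAME and BIGML_API_KEY environment variables; inline sources are typed directly into the online editor in the Dashboard rather than created through code.

    # A minimal sketch, assuming the bigml Python bindings are installed
    # (pip install bigml) and credentials are exported as BIGML_USERNAME
    # and BIGML_API_KEY.
    from bigml.api import BigML

    api = BigML()

    # Local source: a file uploaded directly, subject to the 64 GB limit.
    local_source = api.create_source("./data/sales.csv",
                                     {"name": "local sales data"})

    # Remote source: an S3 URL, subject to the 5 TB limit for S3.
    remote_source = api.create_source("s3://my-bucket/sales.csv",
                                      {"name": "remote sales data"})

    # Wait until BigML has finished processing each source.
    api.ok(local_source)
    api.ok(remote_source)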
If your Machine Learning-ready data exceeds these size limits, please consider a BigML Private Deployment, which can raise those limits and be tailored to handle bigger datasets.