[DAEXIM-16] Provide option to batch data getting imported Created: 11/Nov/19 Updated: 15/Nov/19 Resolved: 15/Nov/19 |
|
| Status: | Resolved |
| Project: | daexim |
| Component/s: | Import |
| Affects Version/s: | None |
| Fix Version/s: | magnesium |
| Type: | New Feature | Priority: | Medium |
| Reporter: | Ajay Lele | Assignee: | Ajay Lele |
| Resolution: | Done | Votes: | 0 |
| Labels: | None | ||
| Remaining Estimate: | Not Specified | ||
| Time Spent: | Not Specified | ||
| Original Estimate: | Not Specified | ||
| Description |
|
When the size of the data being imported is very large, the import operation fails due to timeouts and similar limits. Options such as strict-data-consistency=false help to some extent, but not beyond a certain data size. Provide an option whereby the data being imported can be ingested in smaller batches, so that the import does not result in one giant write operation but is instead broken down into smaller writes. |
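
The batching idea requested above can be sketched as follows. This is an illustrative sketch only, not the daexim implementation: the names `import_batched`, `chunked`, `write_batch`, and the default batch size of 1000 are all hypothetical.

```python
def chunked(items, batch_size):
    """Yield successive slices of at most batch_size items."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

def import_batched(records, write_batch, batch_size=1000):
    """Ingest records in small batches instead of one giant write.

    write_batch is a hypothetical callable standing in for a datastore
    write transaction; each call commits one batch, so no single huge
    transaction has to complete within the timeout.
    """
    written = 0
    for batch in chunked(records, batch_size):
        write_batch(batch)      # commit this batch independently
        written += len(batch)
    return written
```

For example, importing 2500 records with a batch size of 1000 would issue three separate writes (1000, 1000, and 500 records) rather than a single 2500-record write.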