Context
We are designing a middleware app that will keep the file system of another application in sync with Dropbox, because organizations want to leverage Dropbox features and storage on top of what that app offers. We have a single customer in mind at the moment, but we will be offering our product to other organizations that use this particular app as well.
We are monitoring application events and running scheduled processes for all of the typical file system events. Nothing crazy. However, since this is an application-level sync for an entire organization, the amount of data being moved is not insignificant.
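To make the shape of our sync concrete, here is a minimal sketch of the pattern described above: application events land in a queue, and a scheduled worker drains them in batches so that a burst of file system events becomes a handful of sync operations rather than one API call per event. All names here are illustrative of our design, not any Dropbox SDK.

```python
import queue

class BatchedSyncWorker:
    """Collects application file events and hands them off in batches.

    sync_batch is a callable taking a list of events; in our middleware
    it would translate a batch into Dropbox upload/move/delete calls.
    """

    def __init__(self, sync_batch):
        self.events = queue.Queue()
        self.sync_batch = sync_batch

    def record(self, event):
        # Called from the application-event listener.
        self.events.put(event)

    def drain_once(self):
        # Called by the scheduled process: empty the queue and sync
        # everything accumulated since the last run as one batch.
        batch = []
        while True:
            try:
                batch.append(self.events.get_nowait())
            except queue.Empty:
                break
        if batch:
            self.sync_batch(batch)
        return len(batch)
```

The scheduled process simply calls `drain_once()` on an interval, which is also where we would apply any rate limiting before the Dropbox calls go out.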
Questions (numbered only for response reference)
- I assume that each organization using this sync will need to create a service account that acts as the user for our middleware app. Is that a correct assumption?
- I've read about the Data Transport Limits that apply to business accounts at various levels; our customers would always fall into that group. We have an internal system that can throttle against external API limits; however, since this limit is not scoped to our calls alone, what are the best practices for maintaining availability? Are there different limits by plan type or some other mechanism? We expect each of our users to handle about 500k-1M uploads per month (not including other calls that do not count toward the data transport limit). Any other considerations?
- Your developer docs indicate that an app is not eligible for production review until it has 50 users. No problem. In our case, though, we will have one "user" per customer, each syncing their entire other app to Dropbox. I just want to confirm that this model doesn't change the requirement.
- Any other questions we should be asking?
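For question 2, the mitigation we have in mind is a retry wrapper that honors a server-supplied Retry-After delay when one is given and otherwise falls back to exponential backoff with jitter. This is a generic sketch under our own assumptions, not Dropbox SDK code; `request_fn` and its `(status, retry_after, body)` return shape are hypothetical stand-ins for however the call is actually made.

```python
import random
import time

def call_with_backoff(request_fn, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry a rate-limited call.

    request_fn() returns (status_code, retry_after_seconds_or_None, body).
    On a 429, wait the server-specified Retry-After if present, else an
    exponentially growing delay with random jitter, then try again.
    """
    for attempt in range(max_retries):
        status, retry_after, body = request_fn()
        if status != 429:
            return body
        if retry_after is not None:
            delay = retry_after
        else:
            delay = base_delay * (2 ** attempt) * (1 + random.random())
        sleep(delay)
    raise RuntimeError("rate limited: retries exhausted")
```

Since the transport limit is shared across all callers on the account rather than scoped to our app, we'd also cap our own concurrency well below the limit, which is part of why we're asking about per-plan differences.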