I am attempting to orchestrate downloads using the Dropbox Python API and Google Cloud's Cloud Composer. The goal is to download files from my Dropbox account, unzip them, and upload the contents to my BigQuery warehouse for analysis. Does anyone have a more efficient way of accomplishing this? I am currently stuck at downloading the files from Dropbox using the API: the calls appear to be successful, but the files do not appear on my local drive. I have tried every conceivable combination for the download path, including Cloud Storage URLs and local file paths. A sample of the code that downloads the files is as follows:
def download_files_in_folder():
    """Download a file from Dropbox to the local machine."""
    dbx = connect_to_dropbox()
    files = dbx.files_list_folder(dbxPath)
    for entry in files.entries:
        print(entry.path_lower)
        dbx.files_download_to_file(r"Users\damian.ohienmhen\\" + entry.name, entry.path_lower)
        print('downloaded file')
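For clarity, here is the same loop restructured with os.path.join and an explicit target_dir parameter (this restructuring is my own and has not been verified in Composer; dbx is an authenticated dropbox.Dropbox client):

```python
import os

def download_folder(dbx, dbx_folder, target_dir):
    # Download every file in a Dropbox folder into target_dir,
    # which must already exist and be writable on this machine.
    local_paths = []
    result = dbx.files_list_folder(dbx_folder)
    for entry in result.entries:
        local_path = os.path.join(target_dir, entry.name)
        dbx.files_download_to_file(local_path, entry.path_lower)
        local_paths.append(local_path)
    return local_paths
```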
Can anyone help me?
Addendum: I ran the code from the Google Cloud command line as a standalone Python script, and the following worked:
for entry in files.entries:
    print(entry.path_lower)
    dbx.files_download_to_file(r"/home/dohmhen/" + entry.name + entry.name, entry.path_lower)
    print('downloaded file')
Is there a way to specify a file download path while running in the Cloud Composer environment? The same script fails in that environment with the following message:
FileNotFoundError: [Errno 2] No such file or directory: '/home/dohmhen/2022-06-29_7-20-04 pm_hk_1656544804.zip2022-06-29_7-20-04 pm_hk_1656544804.zip'
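For reference, one approach I am considering is building the download path from the system temp directory instead of a hardcoded home directory, since /home/dohmhen may simply not exist on the managed Composer workers (that is an assumption on my part; the helper below is my own sketch):

```python
import os
import tempfile

def composer_download_path(file_name):
    # Assumption: the system temp directory (usually /tmp) is writable
    # on Cloud Composer workers, while /home/<user> may not exist there.
    target_dir = os.path.join(tempfile.gettempdir(), "dropbox_downloads")
    os.makedirs(target_dir, exist_ok=True)
    return os.path.join(target_dir, file_name)
```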