How To Automate File Export From FTP On Cloud Engine To Cloud Storage
I have built an FTP server on Google Cloud Compute Engine, where several users send data to directories on this server as follows:
user1 sends data to
user2 sends data to
user3 sends data to
I want to automate moving data from user1, user2, and user3 to storage buckets named after them whenever they add a new file to their directory (the storage will serve as an archive for this data).
My question is:
Is it possible to use Cloud Functions to do this? If so, what trigger can be used in this case?
Also, if there is any example out there that can help me understand the process, that would be great.
Thanks in advance.
This can be accomplished with a Background Cloud Function and a Cloud Storage trigger: the trigger can be set to monitor a specific bucket for new files and execute your script whenever it fires.
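As a rough illustration, here is a minimal sketch of such a Background Cloud Function (Python runtime) fired by the Storage `finalize` event. The folder layout (`<user>/<file>` inside the monitored bucket) and the `<user>-archive` bucket naming are assumptions, not something from your setup:

```python
# Sketch of a Background Cloud Function triggered on
# google.storage.object.finalize for the bucket receiving uploads.

def archive_bucket_for(object_name):
    """Map an uploaded object like 'user1/report.csv' to its archive bucket.

    Assumes each user's files land under a top-level folder named after the
    user, and that an archive bucket '<user>-archive' already exists.
    """
    user = object_name.split("/", 1)[0]
    return f"{user}-archive"


def archive_new_file(event, context):
    """Entry point: copy each newly finalized object to the user's archive."""
    # Imported lazily; requires the google-cloud-storage package in
    # requirements.txt when deployed.
    from google.cloud import storage

    client = storage.Client()
    src_bucket = client.bucket(event["bucket"])
    blob = src_bucket.blob(event["name"])
    dest_bucket = client.bucket(archive_bucket_for(event["name"]))

    # Copy the object into the archive bucket under the same name.
    src_bucket.copy_blob(blob, dest_bucket, event["name"])
    print(f"Archived {event['name']} to {dest_bucket.name}")
```

You would deploy this with `archive_new_file` as the entry point and the upload bucket as the trigger resource; the `event` dict carries the `bucket` and `name` of the finalized object.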
More info in the links provided above.
Let me know if it helps.
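Note that a Storage trigger only fires once the files are already in a bucket, so you also need something on the VM to push new FTP uploads into Cloud Storage. One simple hedged option (assuming the FTP directories live under `/srv/ftp` and the buckets are named after the users) is a cron job running `gsutil rsync`:

```shell
# Example crontab entries on the FTP VM (paths and bucket names assumed):
# every minute, mirror each user's FTP directory into a same-named bucket.
* * * * * gsutil -m rsync -r /srv/ftp/user1 gs://user1
* * * * * gsutil -m rsync -r /srv/ftp/user2 gs://user2
* * * * * gsutil -m rsync -r /srv/ftp/user3 gs://user3
```

`rsync` only copies files that are new or changed, so repeated runs are cheap; `-m` parallelizes the transfer.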