OCI Object Storage SDK: How to Upload Large Files Without Timeout Errors
I'm experiencing issues when uploading large files (over 100 MB) to OCI Object Storage using the OCI Python SDK (version 2.36.0). The upload frequently fails with `TimeoutError: The operation timed out`. I've tried increasing the timeout settings in my configuration, but it doesn't seem to help. I've looked through the documentation and I'm sure I'm missing something obvious.

Here's a snippet of the code I'm using:

```python
import oci

# Load the config from the default config file
config = oci.config.from_file()
object_storage_client = oci.object_storage.ObjectStorageClient(config)

namespace = object_storage_client.get_namespace().data
bucket_name = 'my_bucket'
file_path = 'path/to/large_file.zip'
object_name = 'large_file.zip'

with open(file_path, 'rb') as f:
    # Setting a custom timeout, but it doesn't seem to work
    response = object_storage_client.put_object(
        namespace,
        bucket_name,
        object_name,
        f,
        timeout=1200  # 20 minutes timeout for large files
    )
```

I've also tried splitting the file into smaller chunks using multipart uploads, but I hit similar timeout issues with that approach (rough sketch below).

Any advice on resolving these timeout errors, or best practices for handling large file uploads in OCI, would be greatly appreciated. Are there specific configurations I might be overlooking, or recommended ways to use multipart uploads efficiently? Could this be a known issue?

For context: this is part of a larger CLI tool I'm building, and I'm running Python on Linux.
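For reference, here's roughly what my multipart attempt looks like. It uses the SDK's `UploadManager`; the part size and parallelism values are just guesses on my part rather than anything I took from the docs, and the bucket/file names are placeholders:

```python
import oci

# Load config and create the Object Storage client as before
config = oci.config.from_file()
object_storage_client = oci.object_storage.ObjectStorageClient(config)

namespace = object_storage_client.get_namespace().data
bucket_name = 'my_bucket'               # placeholder
file_path = 'path/to/large_file.zip'    # placeholder
object_name = 'large_file.zip'

# UploadManager splits the file into parts and uploads them,
# optionally in parallel; these settings are my guesses
upload_manager = oci.object_storage.UploadManager(
    object_storage_client,
    allow_parallel_uploads=True,
    parallel_process_count=3
)

response = upload_manager.upload_file(
    namespace,
    bucket_name,
    object_name,
    file_path,
    part_size=10 * 1024 * 1024  # 10 MiB parts (arbitrary choice)
)
print(response.status)  # expecting 200 on success
```

This mostly works for medium-sized files, but on the 100 MB+ files individual parts still time out intermittently.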