Azure
- pydantic model rohmu.object_storage.config.AzureObjectStorageConfig
JSON schema:
{ "title": "AzureObjectStorageConfig", "type": "object", "properties": { "storage_type": { "title": "Storage Type", "default": "azure", "enum": [ "azure" ], "type": "string" }, "statsd_info": { "$ref": "#/definitions/StatsdConfig" }, "bucket_name": { "title": "Bucket Name", "type": "string" }, "account_name": { "title": "Account Name", "type": "string" }, "account_key": { "title": "Account Key", "type": "string" }, "sas_token": { "title": "Sas Token", "type": "string" }, "prefix": { "title": "Prefix", "type": "string" }, "is_secure": { "title": "Is Secure", "default": true, "type": "boolean" }, "host": { "title": "Host", "type": "string" }, "port": { "title": "Port", "type": "integer" }, "azure_cloud": { "title": "Azure Cloud", "type": "string" }, "proxy_info": { "$ref": "#/definitions/ProxyInfo" } }, "required": [ "account_name" ], "definitions": { "MessageFormat": { "title": "MessageFormat", "description": "An enumeration.", "enum": [ "datadog", "telegraf" ], "type": "string" }, "StatsdConfig": { "title": "StatsdConfig", "type": "object", "properties": { "host": { "title": "Host", "default": "127.0.0.1", "type": "string" }, "port": { "title": "Port", "default": 8125, "type": "integer" }, "message_format": { "default": "telegraf", "allOf": [ { "$ref": "#/definitions/MessageFormat" } ] }, "tags": { "title": "Tags", "default": {}, "type": "object", "additionalProperties": { "anyOf": [ { "type": "integer" }, { "type": "string" } ] } }, "operation_map": { "title": "Operation Map", "default": {}, "type": "object", "additionalProperties": { "type": "string" } } }, "additionalProperties": false }, "ProxyType": { "title": "ProxyType", "description": "An enumeration.", "enum": [ "socks5", "http" ], "type": "string" }, "ProxyInfo": { "title": "ProxyInfo", "type": "object", "properties": { "host": { "title": "Host", "type": "string" }, "port": { "title": "Port", "type": "integer" }, "type": { "$ref": "#/definitions/ProxyType" }, "user": { "title": "User", "type": "string" }, "pass": { "title": "Pass", "type": "string" } }, "required": [ "host", "port", "type" ], "additionalProperties": false } } }
- Config
arbitrary_types_allowed: bool = True
extra_forbid: bool = True
use_enum_values: bool = True
- Fields
- Validators
- field account_key: Optional[str] = None
- Validated by host_and_port_must_be_set_together
- field account_name: str [Required]
- Validated by host_and_port_must_be_set_together
- field azure_cloud: Optional[str] = None
- Validated by host_and_port_must_be_set_together, valid_azure_cloud_endpoint
- field bucket_name: Optional[str] = None
- Validated by host_and_port_must_be_set_together
- field host: Optional[str] = None
- Validated by host_and_port_must_be_set_together
- field is_secure: bool = True
- Validated by host_and_port_must_be_set_together
- field port: Optional[int] = None
- Validated by host_and_port_must_be_set_together
- field prefix: Optional[str] = None
- Validated by host_and_port_must_be_set_together
- field sas_token: Optional[str] = None
- Validated by host_and_port_must_be_set_together
- field storage_type: Literal[StorageDriver.azure] = StorageDriver.azure
- Validated by host_and_port_must_be_set_together
- validator host_and_port_must_be_set_together » all fields
- validator valid_azure_cloud_endpoint » azure_cloud
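Example: a minimal sketch of how these validators surface to callers, assuming host_and_port_must_be_set_together rejects a host supplied without a port (inferred from its name); the account and host values are placeholders.

    from pydantic import ValidationError
    from rohmu.object_storage.config import AzureObjectStorageConfig

    # Minimal valid config: account_name is the only required field.
    config = AzureObjectStorageConfig(account_name="myaccount", bucket_name="backups")

    # Assumption: host and port must be set together, so giving only host fails.
    try:
        AzureObjectStorageConfig(account_name="myaccount", host="azurite.local")
    except ValidationError as error:
        print(error)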
- class rohmu.object_storage.azure.AzureTransfer(bucket_name: str, account_name: str, account_key: Optional[str] = None, sas_token: Optional[str] = None, prefix: Optional[str] = None, is_secure: bool = True, host: Optional[str] = None, port: Optional[int] = None, azure_cloud: Optional[str] = None, proxy_info: Optional[dict[str, Union[str, int]]] = None, notifier: Optional[Notifier] = None, statsd_info: Optional[StatsdConfig] = None, ensure_object_store_available: bool = True)
- close() None
Release all resources associated with the Transfer object.
- copy_file(*, source_key: str, destination_key: str, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) None
Performs a remote copy from the source key name to the destination key name. The key must identify a file; trees cannot be copied with this method. If no metadata is given, the existing metadata is copied.
- create_object_store_if_needed() None
Create the backing object store if it’s needed (e.g. creating directories, buckets, etc.).
Raise Rohmu-specific error TransferObjectStoreInitializationError to abstract away implementation-specific details.
- get_contents_to_fileobj(key: str, fileobj_to_store_to: BinaryIO, *, byte_range: Optional[Tuple[int, int]] = None, progress_callback: Optional[Callable[[int, int], None]] = None) Dict[str, Any]
Like get_contents_to_file() but writes to an open file-like object.
- get_file_size(key: str) int
Returns an int indicating the size of the file in bytes
- verify_object_storage() None
Perform read-only operations to verify the backing object store is available and accessible.
Raise Rohmu-specific error TransferObjectStoreInitializationError to abstract away implementation-specific details.
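Example: a hedged sketch that builds the transfer directly from the documented constructor and uses only the methods listed above; the account, key and container names are placeholders, and the container is assumed to exist so the availability check passes.

    import io
    from rohmu.object_storage.azure import AzureTransfer

    transfer = AzureTransfer(
        bucket_name="backups",             # placeholder container name
        account_name="myaccount",          # placeholder storage account
        account_key="base64-account-key",  # placeholder secret
        prefix="prod/",
    )

    transfer.verify_object_storage()  # read-only availability check

    size = transfer.get_file_size("db/basebackup.tar")
    buf = io.BytesIO()
    # byte_range fetches only part of the object, here the first kilobyte.
    transfer.get_contents_to_fileobj("db/basebackup.tar", buf, byte_range=(0, 1023))
    print(size, len(buf.getvalue()))

    transfer.close()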
Google
- pydantic model rohmu.object_storage.config.GoogleObjectStorageConfig
JSON schema:
{ "title": "GoogleObjectStorageConfig", "type": "object", "properties": { "storage_type": { "title": "Storage Type", "default": "google", "enum": [ "google" ], "type": "string" }, "statsd_info": { "$ref": "#/definitions/StatsdConfig" }, "project_id": { "title": "Project Id", "type": "string" }, "bucket_name": { "title": "Bucket Name", "type": "string" }, "credential_file": { "title": "Credential File", "type": "string", "format": "path" }, "credentials": { "title": "Credentials", "type": "object" }, "proxy_info": { "$ref": "#/definitions/ProxyInfo" }, "prefix": { "title": "Prefix", "type": "string" } }, "required": [ "project_id" ], "definitions": { "MessageFormat": { "title": "MessageFormat", "description": "An enumeration.", "enum": [ "datadog", "telegraf" ], "type": "string" }, "StatsdConfig": { "title": "StatsdConfig", "type": "object", "properties": { "host": { "title": "Host", "default": "127.0.0.1", "type": "string" }, "port": { "title": "Port", "default": 8125, "type": "integer" }, "message_format": { "default": "telegraf", "allOf": [ { "$ref": "#/definitions/MessageFormat" } ] }, "tags": { "title": "Tags", "default": {}, "type": "object", "additionalProperties": { "anyOf": [ { "type": "integer" }, { "type": "string" } ] } }, "operation_map": { "title": "Operation Map", "default": {}, "type": "object", "additionalProperties": { "type": "string" } } }, "additionalProperties": false }, "ProxyType": { "title": "ProxyType", "description": "An enumeration.", "enum": [ "socks5", "http" ], "type": "string" }, "ProxyInfo": { "title": "ProxyInfo", "type": "object", "properties": { "host": { "title": "Host", "type": "string" }, "port": { "title": "Port", "type": "integer" }, "type": { "$ref": "#/definitions/ProxyType" }, "user": { "title": "User", "type": "string" }, "pass": { "title": "Pass", "type": "string" } }, "required": [ "host", "port", "type" ], "additionalProperties": false } } }
- Config
arbitrary_types_allowed: bool = True
extra_forbid: bool = True
use_enum_values: bool = True
- Fields
- field bucket_name: Optional[str] = None
- field credential_file: Optional[Path] = None
- field credentials: Optional[Dict[str, Any]] = None
- field prefix: Optional[str] = None
- field project_id: str [Required]
- field storage_type: Literal[StorageDriver.google] = StorageDriver.google
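Example: a sketch of the two documented ways to supply credentials to the config model; the project, bucket and key-file path are placeholders, and the credentials dict would normally contain the full service-account key, not just its type field.

    from pathlib import Path
    from rohmu.object_storage.config import GoogleObjectStorageConfig

    # Credentials as an already-parsed service-account dict (placeholder contents).
    config = GoogleObjectStorageConfig(
        project_id="my-gcp-project",
        bucket_name="backups",
        credentials={"type": "service_account"},
    )

    # Or point credential_file at the key file on disk instead.
    config_from_file = GoogleObjectStorageConfig(
        project_id="my-gcp-project",
        bucket_name="backups",
        credential_file=Path("/etc/rohmu/gcs-key.json"),
    )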
- class rohmu.object_storage.google.GoogleTransfer(project_id: str, bucket_name: str, credential_file: Optional[TextIO] = None, credentials: Optional[dict[str, Any]] = None, prefix: Optional[str] = None, proxy_info: Optional[dict[str, Union[str, int]]] = None, notifier: Optional[Notifier] = None, statsd_info: Optional[StatsdConfig] = None, ensure_object_store_available: bool = True)
- close() None
Release all resources associated with the Transfer object.
- copy_file(*, source_key: str, destination_key: str, metadata: Optional[Dict[str, Any]] = None, **_kwargs: Any) None
Performs a remote copy from the source key name to the destination key name. The key must identify a file; trees cannot be copied with this method. If no metadata is given, the existing metadata is copied.
- create_object_store_if_needed() None
Create the backing object store if it’s needed (e.g. creating directories, buckets, etc.).
Raise Rohmu-specific error TransferObjectStoreInitializationError to abstract away implementation-specific details.
- get_contents_to_fileobj(key: str, fileobj_to_store_to: BinaryIO, *, byte_range: Optional[Tuple[int, int]] = None, progress_callback: Optional[Callable[[int, int], None]] = None) Dict[str, Any]
Like get_contents_to_file() but writes to an open file-like object.
- get_file_size(key: str) int
Returns an int indicating the size of the file in bytes
- get_or_create_bucket(bucket_name: str) str
Deprecated: use create_object_store_if_needed() instead
- verify_object_storage() None
Perform read-only operations to verify the backing object store is available and accessible.
Raise Rohmu-specific error TransferObjectStoreInitializationError to abstract away implementation-specific details.
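Example: a hedged usage sketch built only from the documented constructor and methods; the project, bucket and credentials are placeholders, and the bucket is assumed to exist already.

    from rohmu.object_storage.google import GoogleTransfer

    transfer = GoogleTransfer(
        project_id="my-gcp-project",
        bucket_name="backups",
        credentials={"type": "service_account"},  # placeholder service-account payload
        prefix="prod/",
    )

    transfer.verify_object_storage()                    # read-only availability check
    print(transfer.get_file_size("db/basebackup.tar"))  # size in bytes
    transfer.close()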
- class rohmu.object_storage.google.MediaStreamUpload(fd: BinaryIO, *, chunk_size: int, mime_type: str, name: str)
Supports streaming an arbitrary amount of data from a non-seekable object that provides a read method.
- chunksize() int
Chunk size for resumable uploads.
- Returns
Chunk size in bytes.
- getbytes(begin: int, length: int) bytes
Get bytes from the media.
- Parameters
begin – int, offset from beginning of file.
length – int, number of bytes to read, starting at begin.
- Returns
The bytes read; may be shorter than length if EOF was reached first.
- has_stream() bool
Does the underlying upload support a streaming interface.
Streaming means it is an io.IOBase subclass that supports seek, i.e. seekable() returns True.
- Returns
True if the call to stream() will return an instance of a seekable io.IOBase subclass.
- mimetype() str
Mime type of the body.
- Returns
Mime type.
- peek() None
Try to top up some data into _next_chunk.
- resumable() bool
Whether this upload is resumable.
- Returns
True if the upload is resumable, otherwise False.
- size() Optional[int]
Size of upload.
- Returns
Size of the body, or None if the size is unknown.
- stream() BinaryIO
A stream interface to the data being uploaded.
- Returns
The returned value is an io.IOBase subclass that supports seek, i.e. seekable() returns True.
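Example: a sketch of wrapping a non-seekable stream with MediaStreamUpload using only the documented constructor; the chunk size, MIME type and object name are arbitrary illustrative values, and the comments on expected return values are inferences from the docstrings above.

    import sys
    from rohmu.object_storage.google import MediaStreamUpload

    # Any object with a read() method works; sys.stdin.buffer is a non-seekable example.
    media = MediaStreamUpload(
        sys.stdin.buffer,
        chunk_size=1024 * 1024,
        mime_type="application/octet-stream",
        name="streamed-object",
    )
    print(media.resumable())  # expected True: the upload proceeds in resumable chunks
    print(media.size())       # expected None while the total size is unknown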
- class rohmu.object_storage.google.MediaIoBaseDownloadWithByteRange(fd: BinaryIO, request: HttpRequest, chunksize: int = 52428800, *, byte_range: tuple[int, int])
This class is mostly a copy of googleapiclient's MediaIoBaseDownload class, with added support for fetching a specific byte_range.
- next_chunk() tuple[googleapiclient.http.MediaDownloadProgress, bool]
Get the next chunk of the download.
- Returns
The value of done will be True when the media has been fully downloaded or the total size of the media is unknown.
- Return type
(status, done)
- Raises
googleapiclient.errors.HttpError if the response was not a 2xx (or a 416 is received and the file is empty).
httplib2.HttpLib2Error if a transport error has occurred.
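Example: a sketch of the standard chunked-download loop implied by the next_chunk() contract above; the HttpRequest is assumed to be a media-download request obtained elsewhere (for example from a Google API client storage service), which is not shown here.

    import io
    from googleapiclient.http import HttpRequest
    from rohmu.object_storage.google import MediaIoBaseDownloadWithByteRange

    def download_range(request: HttpRequest, first_byte: int, last_byte: int) -> bytes:
        """Download only the given byte range of a media request, chunk by chunk."""
        buf = io.BytesIO()
        downloader = MediaIoBaseDownloadWithByteRange(
            buf, request, byte_range=(first_byte, last_byte)
        )
        done = False
        while not done:
            _status, done = downloader.next_chunk()
        return buf.getvalue()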
Local
- pydantic model rohmu.object_storage.config.LocalObjectStorageConfig
JSON schema:
{ "title": "LocalObjectStorageConfig", "type": "object", "properties": { "storage_type": { "title": "Storage Type", "default": "local", "enum": [ "local" ], "type": "string" }, "statsd_info": { "$ref": "#/definitions/StatsdConfig" }, "directory": { "title": "Directory", "type": "string", "format": "path" }, "prefix": { "title": "Prefix", "type": "string" } }, "required": [ "directory" ], "definitions": { "MessageFormat": { "title": "MessageFormat", "description": "An enumeration.", "enum": [ "datadog", "telegraf" ], "type": "string" }, "StatsdConfig": { "title": "StatsdConfig", "type": "object", "properties": { "host": { "title": "Host", "default": "127.0.0.1", "type": "string" }, "port": { "title": "Port", "default": 8125, "type": "integer" }, "message_format": { "default": "telegraf", "allOf": [ { "$ref": "#/definitions/MessageFormat" } ] }, "tags": { "title": "Tags", "default": {}, "type": "object", "additionalProperties": { "anyOf": [ { "type": "integer" }, { "type": "string" } ] } }, "operation_map": { "title": "Operation Map", "default": {}, "type": "object", "additionalProperties": { "type": "string" } } }, "additionalProperties": false } } }
- Config
arbitrary_types_allowed: bool = True
extra_forbid: bool = True
use_enum_values: bool = True
- Fields
- field directory: Path [Required]
- field prefix: Optional[str] = None
- field storage_type: Literal[StorageDriver.local] = StorageDriver.local
- class rohmu.object_storage.local.LocalTransfer(directory: Union[str, Path], prefix: Optional[str] = None, notifier: Optional[Notifier] = None, statsd_info: Optional[StatsdConfig] = None, ensure_object_store_available: bool = True)
- copy_file(*, source_key: str, destination_key: str, metadata: Optional[Dict[str, Any]] = None, **_kwargs: Any) None
Performs a remote copy from the source key name to the destination key name. The key must identify a file; trees cannot be copied with this method. If no metadata is given, the existing metadata is copied.
- create_object_store_if_needed() None
No-op as there’s no need to create the directory ahead of time.
- delete_tree(key: str) None
Delete all keys under the given root key. The basic implementation simply lists all available keys and deletes them individually, but storage providers can implement more efficient logic.
- get_contents_to_fileobj(key: str, fileobj_to_store_to: BinaryIO, *, byte_range: Optional[Tuple[int, int]] = None, progress_callback: Optional[Callable[[int, int], None]] = None) Dict[str, Any]
Like get_contents_to_file() but writes to an open file-like object.
- get_file_size(key: str) int
Returns an int indicating the size of the file in bytes
- verify_object_storage() None
No-op as there’s no need to check for the existence of the directory at setup time.
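Example: local storage needs no credentials, so a complete sketch is short; the directory below is a temporary directory created only for the example.

    import tempfile
    from rohmu.object_storage.config import LocalObjectStorageConfig
    from rohmu.object_storage.local import LocalTransfer

    storage_dir = tempfile.mkdtemp()
    config = LocalObjectStorageConfig(directory=storage_dir, prefix="test/")

    transfer = LocalTransfer(directory=config.directory, prefix=config.prefix)
    transfer.verify_object_storage()          # documented no-op for local storage
    transfer.create_object_store_if_needed()  # likewise a no-op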
S3
- pydantic model rohmu.object_storage.config.S3ObjectStorageConfig
JSON schema:
{ "title": "S3ObjectStorageConfig", "type": "object", "properties": { "storage_type": { "title": "Storage Type", "default": "s3", "enum": [ "s3" ], "type": "string" }, "statsd_info": { "$ref": "#/definitions/StatsdConfig" }, "region": { "title": "Region", "type": "string" }, "bucket_name": { "title": "Bucket Name", "type": "string" }, "aws_access_key_id": { "title": "Aws Access Key Id", "type": "string" }, "aws_secret_access_key": { "title": "Aws Secret Access Key", "type": "string" }, "prefix": { "title": "Prefix", "type": "string" }, "host": { "title": "Host", "type": "string" }, "port": { "title": "Port", "type": "string" }, "addressing_style": { "default": "path", "allOf": [ { "$ref": "#/definitions/S3AddressingStyle" } ] }, "is_secure": { "title": "Is Secure", "default": false, "type": "boolean" }, "is_verify_tls": { "title": "Is Verify Tls", "default": false, "type": "boolean" }, "cert_path": { "title": "Cert Path", "type": "string", "format": "path" }, "segment_size": { "title": "Segment Size", "default": 40894464, "type": "integer" }, "encrypted": { "title": "Encrypted", "default": false, "type": "boolean" }, "proxy_info": { "$ref": "#/definitions/ProxyInfo" }, "connect_timeout": { "title": "Connect Timeout", "type": "string" }, "read_timeout": { "title": "Read Timeout", "type": "string" }, "aws_session_token": { "title": "Aws Session Token", "type": "string" }, "use_dualstack_endpoint": { "title": "Use Dualstack Endpoint", "default": true, "type": "boolean" } }, "required": [ "region" ], "definitions": { "MessageFormat": { "title": "MessageFormat", "description": "An enumeration.", "enum": [ "datadog", "telegraf" ], "type": "string" }, "StatsdConfig": { "title": "StatsdConfig", "type": "object", "properties": { "host": { "title": "Host", "default": "127.0.0.1", "type": "string" }, "port": { "title": "Port", "default": 8125, "type": "integer" }, "message_format": { "default": "telegraf", "allOf": [ { "$ref": "#/definitions/MessageFormat" } ] }, "tags": { "title": "Tags", "default": {}, "type": "object", "additionalProperties": { "anyOf": [ { "type": "integer" }, { "type": "string" } ] } }, "operation_map": { "title": "Operation Map", "default": {}, "type": "object", "additionalProperties": { "type": "string" } } }, "additionalProperties": false }, "S3AddressingStyle": { "title": "S3AddressingStyle", "description": "An enumeration.", "enum": [ "auto", "path", "virtual" ] }, "ProxyType": { "title": "ProxyType", "description": "An enumeration.", "enum": [ "socks5", "http" ], "type": "string" }, "ProxyInfo": { "title": "ProxyInfo", "type": "object", "properties": { "host": { "title": "Host", "type": "string" }, "port": { "title": "Port", "type": "integer" }, "type": { "$ref": "#/definitions/ProxyType" }, "user": { "title": "User", "type": "string" }, "pass": { "title": "Pass", "type": "string" } }, "required": [ "host", "port", "type" ], "additionalProperties": false } } }
- Config
arbitrary_types_allowed: bool = True
extra_forbid: bool = True
use_enum_values: bool = True
- Fields
- field addressing_style: S3AddressingStyle = S3AddressingStyle.path
- Validated by validate_is_verify_tls_and_cert_path
- field aws_access_key_id: Optional[str] = None
- Validated by validate_is_verify_tls_and_cert_path
- field aws_secret_access_key: Optional[str] = None
- Validated by validate_is_verify_tls_and_cert_path
- field aws_session_token: Optional[str] = None
- Validated by validate_is_verify_tls_and_cert_path
- field bucket_name: Optional[str] = None
- Validated by validate_is_verify_tls_and_cert_path
- field cert_path: Optional[Path] = None
- Validated by validate_is_verify_tls_and_cert_path
- field connect_timeout: Optional[str] = None
- Validated by validate_is_verify_tls_and_cert_path
- field encrypted: bool = False
- Validated by validate_is_verify_tls_and_cert_path
- field host: Optional[str] = None
- Validated by validate_is_verify_tls_and_cert_path
- field is_secure: bool = False
- Validated by validate_is_verify_tls_and_cert_path
- field is_verify_tls: bool = False
- Validated by validate_is_verify_tls_and_cert_path
- field port: Optional[str] = None
- Validated by validate_is_verify_tls_and_cert_path
- field prefix: Optional[str] = None
- Validated by validate_is_verify_tls_and_cert_path
- field read_timeout: Optional[str] = None
- Validated by validate_is_verify_tls_and_cert_path
- field region: str [Required]
- Validated by validate_is_verify_tls_and_cert_path
- field segment_size: int = 40894464
- Validated by validate_is_verify_tls_and_cert_path
- field storage_type: Literal[StorageDriver.s3] = StorageDriver.s3
- Validated by validate_is_verify_tls_and_cert_path
- field use_dualstack_endpoint: Optional[bool] = True
- Validated by validate_is_verify_tls_and_cert_path
- validator validate_is_verify_tls_and_cert_path » all fields
- class rohmu.object_storage.s3.S3Transfer(region: str, bucket_name: str, aws_access_key_id: Optional[str] = None, aws_secret_access_key: Optional[str] = None, prefix: Optional[str] = None, host: Optional[str] = None, port: Optional[int] = None, addressing_style: S3AddressingStyle = S3AddressingStyle.path, is_secure: bool = False, is_verify_tls: bool = False, cert_path: Optional[Path] = None, segment_size: int = 40894464, encrypted: bool = False, proxy_info: Optional[dict[str, Union[str, int]]] = None, connect_timeout: Optional[float] = None, read_timeout: Optional[float] = None, notifier: Optional[Notifier] = None, aws_session_token: Optional[str] = None, use_dualstack_endpoint: Optional[bool] = True, statsd_info: Optional[StatsdConfig] = None, ensure_object_store_available: bool = True)
- close() None
Release all resources associated with the Transfer object.
- copy_file(*, source_key: str, destination_key: str, metadata: Optional[Dict[str, Any]] = None, **_kwargs: Any) None
Performs a remote copy from the source key name to the destination key name. The key must identify a file; trees cannot be copied with this method. If no metadata is given, the existing metadata is copied.
- create_object_store_if_needed() None
Create the backing object store if it’s needed (e.g. creating directories, buckets, etc.).
Raise Rohmu-specific error TransferObjectStoreInitializationError to abstract away implementation-specific details.
- delete_keys(keys: Collection[str]) None
Delete specified keys
- get_contents_to_fileobj(key: str, fileobj_to_store_to: BinaryIO, *, byte_range: Optional[Tuple[int, int]] = None, progress_callback: Optional[Callable[[int, int], None]] = None) Dict[str, Any]
Like get_contents_to_file() but writes to an open file-like object.
- get_file_size(key: str) int
Returns an int indicating the size of the file in bytes
- upload_concurrent_chunk(upload: ConcurrentUpload, chunk_number: int, fd: BinaryIO, upload_progress_fn: Optional[Callable[[int], None]] = None) None
Synchronously uploads a chunk. Returns an ETag for the uploaded chunk. This method is thread-safe, so you can call it concurrently from multiple threads to upload different chunks. What happens if multiple threads try to upload the same chunk_number concurrently is unspecified.
- verify_object_storage() None
Perform read-only operations to verify the backing object store is available and accessible.
Raise Rohmu-specific error TransferObjectStoreInitializationError to abstract away implementation-specific details.
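Example: a hedged sketch against an S3-compatible endpoint using only the documented constructor arguments and methods; all names, credentials and the endpoint are placeholders, and the bucket and source object are assumed to exist.

    from rohmu.object_storage.s3 import S3Transfer

    transfer = S3Transfer(
        region="us-east-1",
        bucket_name="backups",
        aws_access_key_id="EXAMPLEKEYID",       # placeholder credential
        aws_secret_access_key="examplesecret",  # placeholder credential
        host="s3.example.internal",             # only needed for S3-compatible services
        port=9000,
        is_secure=True,
        prefix="prod/",
    )

    transfer.verify_object_storage()
    transfer.copy_file(source_key="db/basebackup.tar", destination_key="archive/basebackup.tar")
    transfer.delete_keys(["db/basebackup.tar"])
    transfer.close()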
SFTP
- pydantic model rohmu.object_storage.config.SFTPObjectStorageConfig
JSON schema:
{ "title": "SFTPObjectStorageConfig", "type": "object", "properties": { "storage_type": { "title": "Storage Type", "default": "sftp", "enum": [ "sftp" ], "type": "string" }, "statsd_info": { "$ref": "#/definitions/StatsdConfig" }, "server": { "title": "Server", "type": "string" }, "port": { "title": "Port", "type": "integer" }, "username": { "title": "Username", "type": "string" }, "password": { "title": "Password", "type": "string" }, "private_key": { "title": "Private Key", "type": "string" }, "prefix": { "title": "Prefix", "type": "string" } }, "required": [ "server", "port", "username" ], "definitions": { "MessageFormat": { "title": "MessageFormat", "description": "An enumeration.", "enum": [ "datadog", "telegraf" ], "type": "string" }, "StatsdConfig": { "title": "StatsdConfig", "type": "object", "properties": { "host": { "title": "Host", "default": "127.0.0.1", "type": "string" }, "port": { "title": "Port", "default": 8125, "type": "integer" }, "message_format": { "default": "telegraf", "allOf": [ { "$ref": "#/definitions/MessageFormat" } ] }, "tags": { "title": "Tags", "default": {}, "type": "object", "additionalProperties": { "anyOf": [ { "type": "integer" }, { "type": "string" } ] } }, "operation_map": { "title": "Operation Map", "default": {}, "type": "object", "additionalProperties": { "type": "string" } } }, "additionalProperties": false } } }
- Config
arbitrary_types_allowed: bool = True
extra_forbid: bool = True
use_enum_values: bool = True
- Fields
- field password: Optional[str] = None
- field port: int [Required]
- field prefix: Optional[str] = None
- field private_key: Optional[str] = None
- field server: str [Required]
- field storage_type: Literal[StorageDriver.sftp] = StorageDriver.sftp
- field username: str [Required]
- class rohmu.object_storage.sftp.SFTPTransfer(server: str, port: int, username: str, password: Optional[str] = None, private_key: Optional[str] = None, prefix: Optional[str] = None, notifier: Optional[Notifier] = None, statsd_info: Optional[StatsdConfig] = None, ensure_object_store_available: bool = True)
- copy_file(*, source_key: str, destination_key: str, metadata: Optional[Dict[str, Any]] = None, **_kwargs: Any) None
Performs a remote copy from the source key name to the destination key name. The key must identify a file; trees cannot be copied with this method. If no metadata is given, the existing metadata is copied.
- create_object_store_if_needed() None
No-op as it’s not applicable to SFTP transfers
- get_contents_to_fileobj(key: str, fileobj_to_store_to: BinaryIO, *, byte_range: Optional[Tuple[int, int]] = None, progress_callback: Optional[Callable[[int, int], None]] = None) Dict[str, Any]
Like get_contents_to_file() but writes to an open file-like object.
- get_file_size(key: str) int
Returns an int indicating the size of the file in bytes
- verify_object_storage() None
No-op for now. Eventually, the SFTP connection could be tested here instead of in the constructor.
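Example: a hedged sketch using the documented constructor; the host, user and key are placeholders, and since private_key is only typed as a string, treating it as a key-file path is an assumption noted in the comment.

    from rohmu.object_storage.sftp import SFTPTransfer

    transfer = SFTPTransfer(
        server="sftp.example.com",  # placeholder host
        port=22,
        username="backup",
        private_key="/home/backup/.ssh/id_ed25519",  # assumption: accepted as a key-file path
        prefix="prod/",
    )
    print(transfer.get_file_size("db/basebackup.tar"))  # size in bytes, assuming the key exists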
Swift
- pydantic model rohmu.object_storage.config.SwiftObjectStorageConfig
JSON schema:
{ "title": "SwiftObjectStorageConfig", "type": "object", "properties": { "storage_type": { "title": "Storage Type", "default": "swift", "enum": [ "swift" ], "type": "string" }, "statsd_info": { "$ref": "#/definitions/StatsdConfig" }, "user": { "title": "User", "type": "string" }, "key": { "title": "Key", "type": "string" }, "container_name": { "title": "Container Name", "type": "string" }, "auth_url": { "title": "Auth Url", "type": "string" }, "auth_version": { "title": "Auth Version", "default": "2.0", "type": "string" }, "tenant_name": { "title": "Tenant Name", "type": "string" }, "segment_size": { "title": "Segment Size", "default": 3221225472, "type": "integer" }, "region_name": { "title": "Region Name", "type": "string" }, "user_id": { "title": "User Id", "type": "string" }, "user_domain_id": { "title": "User Domain Id", "type": "string" }, "user_domain_name": { "title": "User Domain Name", "type": "string" }, "tenant_id": { "title": "Tenant Id", "type": "string" }, "project_id": { "title": "Project Id", "type": "string" }, "project_name": { "title": "Project Name", "type": "string" }, "project_domain_id": { "title": "Project Domain Id", "type": "string" }, "project_domain_name": { "title": "Project Domain Name", "type": "string" }, "service_type": { "title": "Service Type", "type": "string" }, "endpoint_type": { "title": "Endpoint Type", "type": "string" }, "prefix": { "title": "Prefix", "type": "string" } }, "required": [ "user", "key", "container_name", "auth_url" ], "definitions": { "MessageFormat": { "title": "MessageFormat", "description": "An enumeration.", "enum": [ "datadog", "telegraf" ], "type": "string" }, "StatsdConfig": { "title": "StatsdConfig", "type": "object", "properties": { "host": { "title": "Host", "default": "127.0.0.1", "type": "string" }, "port": { "title": "Port", "default": 8125, "type": "integer" }, "message_format": { "default": "telegraf", "allOf": [ { "$ref": "#/definitions/MessageFormat" } ] }, "tags": { "title": "Tags", "default": {}, "type": "object", "additionalProperties": { "anyOf": [ { "type": "integer" }, { "type": "string" } ] } }, "operation_map": { "title": "Operation Map", "default": {}, "type": "object", "additionalProperties": { "type": "string" } } }, "additionalProperties": false } } }
- Config
arbitrary_types_allowed: bool = True
extra_forbid: bool = True
use_enum_values: bool = True
- Fields
- field auth_url: str [Required]
- field auth_version: str = '2.0'
- field container_name: str [Required]
- field endpoint_type: Optional[str] = None
- field key: str [Required]
- field prefix: Optional[str] = None
- field project_domain_id: Optional[str] = None
- field project_domain_name: Optional[str] = None
- field project_id: Optional[str] = None
- field project_name: Optional[str] = None
- field region_name: Optional[str] = None
- field segment_size: int = 3221225472
- field service_type: Optional[str] = None
- field storage_type: Literal[StorageDriver.swift] = StorageDriver.swift
- field tenant_id: Optional[str] = None
- field tenant_name: Optional[str] = None
- field user: str [Required]
- field user_domain_id: Optional[str] = None
- field user_domain_name: Optional[str] = None
- field user_id: Optional[str] = None
- class rohmu.object_storage.swift.SwiftTransfer(*, user: str, key: str, container_name: str, auth_url: str, auth_version: str = '2.0', tenant_name: Optional[str] = None, prefix: Optional[str] = None, segment_size: int = 3221225472, region_name: Optional[str] = None, user_id: Optional[str] = None, user_domain_id: Optional[str] = None, user_domain_name: Optional[str] = None, tenant_id: Optional[str] = None, project_id: Optional[str] = None, project_name: Optional[str] = None, project_domain_id: Optional[str] = None, project_domain_name: Optional[str] = None, service_type: Optional[str] = None, endpoint_type: Optional[str] = None, notifier: Optional[Notifier] = None, statsd_info: Optional[StatsdConfig] = None, ensure_object_store_available: bool = True)
- copy_file(*, source_key: str, destination_key: str, metadata: Optional[Dict[str, Any]] = None, **_kwargs: Any) None
Performs a remote copy from the source key name to the destination key name. The key must identify a file; trees cannot be copied with this method. If no metadata is given, the existing metadata is copied.
- create_object_store_if_needed() None
Create the backing object store if it’s needed (e.g. creating directories, buckets, etc.).
Raise Rohmu-specific error TransferObjectStoreInitializationError to abstract away implementation-specific details.
- get_contents_to_fileobj(key: str, fileobj_to_store_to: BinaryIO, *, byte_range: Optional[Tuple[int, int]] = None, progress_callback: Optional[Callable[[int, int], None]] = None) Dict[str, Any]
Like get_contents_to_file() but writes to an open file-like object.
- get_file_size(key: str) int
Returns an int indicating the size of the file in bytes
- verify_object_storage() None
Perform read-only operations to verify the backing object store is available and accessible.
Raise Rohmu-specific error TransferObjectStoreInitializationError to abstract away implementation-specific details.
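Example: a hedged sketch using the documented keyword-only constructor; all Keystone credentials and URLs are placeholders, and the container is assumed to exist.

    from rohmu.object_storage.swift import SwiftTransfer

    transfer = SwiftTransfer(
        user="backup",                                    # placeholder credential
        key="swift-password",                             # placeholder credential
        container_name="backups",
        auth_url="https://keystone.example.com:5000/v3",  # placeholder endpoint
        auth_version="3",
        tenant_name="backup-project",
        prefix="prod/",
    )
    transfer.verify_object_storage()  # read-only availability check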