s3backed

Store big binary or text data in S3 and store only the S3 object URI in DynamoDB.

class pynamodb_mate.attributes.s3backed.S3BackedAttribute(bucket_name: str, key_template: str = 'pynamodb-mate/s3backed/{fingerprint}', compressed: bool = True, s3_client: Optional[BaseClient] = None, hash_key: bool = False, range_key: bool = False, null: Optional[bool] = None, default: Optional[Callable] = None, default_for_new: Optional[Callable] = None, attr_name: Optional[str] = None)

S3BackedAttribute stores the original data in S3 and stores only the S3 URI in DynamoDB.

TODO: automatically roll back if either the DynamoDB write or the S3 write fails.

TODO: add an option to also delete the S3 object when the DynamoDB item is deleted.

Parameters:
  • bucket_name – the name of the S3 bucket where the data is stored.

  • key_template – the template of the S3 key. It must contain “{fingerprint}”, where {fingerprint} is the SHA-256 hash of the data.

  • compressed – whether to compress the data before storing it on S3.

  • s3_client – the boto3 S3 client used to read and write data to S3. It is useful if you want to use multiple S3 buckets with different AWS credentials.

Other parameters are the same as pynamodb.attributes.UnicodeAttribute.

user_serializer(value: Any) → bytes

Implement this method to define how you want to convert your data to binary.

user_deserializer(value: bytes) → Any

Implement this method to define how you want to recover your data from binary.

serialize(value: Any) → str

Take the original value, store it in S3, and return the S3 URI.

deserialize(value: str) → Any

Take the S3 URI and retrieve the original value from S3.
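
To support a custom Python type, subclass S3BackedAttribute and implement user_serializer() and user_deserializer(). The sketch below is illustrative only; the pickle-based attribute and its class name are hypothetical and not part of the library.

    import pickle
    from typing import Any

    from pynamodb_mate.attributes.s3backed import S3BackedAttribute


    class S3BackedPickleAttribute(S3BackedAttribute):
        """Store any picklable Python object in S3; only the S3 URI goes to DynamoDB."""

        def user_serializer(self, value: Any) -> bytes:
            # convert the Python object to binary before it is uploaded to S3
            return pickle.dumps(value)

        def user_deserializer(self, value: bytes) -> Any:
            # rebuild the Python object from the binary downloaded from S3
            return pickle.loads(value)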

class pynamodb_mate.attributes.s3backed.S3BackedBigBinaryAttribute(bucket_name: str, key_template: str = 'pynamodb-mate/s3backed/{fingerprint}', compressed: bool = True, s3_client: Optional[BaseClient] = None, hash_key: bool = False, range_key: bool = False, null: Optional[bool] = None, default: Optional[Callable] = None, default_for_new: Optional[Callable] = None, attr_name: Optional[str] = None)

An attribute that stores big binary data in S3 and stores the S3 URI in DynamoDB.

user_serializer(value: bytes) → bytes

Implement this method to define how you want to convert your data to binary.

user_deserializer(value: bytes) → bytes

Implement this method to define how you want to recover your data from binary.
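
A minimal usage sketch, assuming a regular pynamodb model; the table name, bucket name, and attribute names below are illustrative only, not part of the library.

    from pynamodb.models import Model
    from pynamodb.attributes import UnicodeAttribute

    from pynamodb_mate.attributes.s3backed import S3BackedBigBinaryAttribute


    class Document(Model):
        class Meta:
            table_name = "documents"
            region = "us-east-1"

        doc_id = UnicodeAttribute(hash_key=True)
        # the raw bytes are written to S3; only the S3 URI lands in DynamoDB
        content = S3BackedBigBinaryAttribute(
            bucket_name="my-bucket",
            compressed=True,
        )


    # on save, serialize() uploads the payload to S3 and stores the S3 URI
    Document(doc_id="doc-1", content=b"a very large binary payload").save()

    # on get, deserialize() resolves the S3 URI back to the original bytes
    doc = Document.get("doc-1")
    assert doc.content == b"a very large binary payload"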

class pynamodb_mate.attributes.s3backed.S3BackedBigTextAttribute(bucket_name: str, key_template: str = 'pynamodb-mate/s3backed/{fingerprint}', compressed: bool = True, s3_client: Optional[BaseClient] = None, hash_key: bool = False, range_key: bool = False, null: Optional[bool] = None, default: Optional[Callable] = None, default_for_new: Optional[Callable] = None, attr_name: Optional[str] = None)

An attribute that stores big text in S3 and stores the S3 URI in DynamoDB.

user_serializer(value: str) → bytes

Implement this method to define how you want to convert your data to binary.

user_deserializer(value: bytes) → str

Implement this method to define how you want to recover your data from binary.
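
A minimal sketch of declaring this attribute with a custom key_template; the bucket name and key prefix are illustrative only.

    from pynamodb_mate.attributes.s3backed import S3BackedBigTextAttribute

    # large text (e.g. an HTML page or a log dump) goes to S3; the DynamoDB
    # item only carries the S3 URI pointing at it
    html = S3BackedBigTextAttribute(
        bucket_name="my-bucket",
        key_template="my-app/s3backed/{fingerprint}",
        compressed=True,
    )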

class pynamodb_mate.attributes.s3backed.S3BackedJsonDictAttribute(bucket_name: str, key_template: str = 'pynamodb-mate/s3backed/{fingerprint}', compressed: bool = True, s3_client: Optional[BaseClient] = None, hash_key: bool = False, range_key: bool = False, null: Optional[bool] = None, default: Optional[Callable] = None, default_for_new: Optional[Callable] = None, attr_name: Optional[str] = None)

An attribute that stores a big JSON dict in S3 and stores the S3 URI in DynamoDB.

user_serializer(value: dict) → bytes

Implement this method to define how you want to convert your data to binary.

user_deserializer(value: bytes) → dict

Implement this method to define how you want to recover your data from binary.
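
A minimal usage sketch, assuming a regular pynamodb model; the table name, bucket name, and field names below are illustrative only, not part of the library.

    from pynamodb.models import Model
    from pynamodb.attributes import UnicodeAttribute

    from pynamodb_mate.attributes.s3backed import S3BackedJsonDictAttribute


    class Task(Model):
        class Meta:
            table_name = "tasks"
            region = "us-east-1"

        task_id = UnicodeAttribute(hash_key=True)
        # the dict is JSON-encoded, optionally compressed, and written to S3;
        # only the S3 URI is stored in this attribute
        payload = S3BackedJsonDictAttribute(
            bucket_name="my-bucket",
            compressed=True,
        )


    Task(task_id="t-1", payload={"records": list(range(100_000))}).save()
    task = Task.get("t-1")
    assert task.payload["records"][0] == 0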