b2sdk.raw_simulator – B2 raw API simulator

b2sdk.raw_simulator.get_bytes_range(data_bytes, bytes_range)[source]

Slice a bytes array using a byte range.
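The helper has only a one-line summary, so the range convention is not spelled out. A minimal sketch, assuming the range is an inclusive (start, end) pair as in HTTP Range headers (an assumption, not the documented contract):

```python
def get_bytes_range(data_bytes, bytes_range):
    """Slice a bytes array using a byte range.

    Hedged sketch: an inclusive (start, end) pair is assumed here,
    matching HTTP Range semantics; None means the whole array.
    """
    if bytes_range is None:
        return data_bytes
    start, end = bytes_range
    # Python slices exclude the end index, so add 1 for inclusive ranges.
    return data_bytes[start:end + 1]
```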

class b2sdk.raw_simulator.KeySimulator(account_id, name, application_key_id, key, capabilities, expiration_timestamp_or_none, bucket_id_or_none, bucket_name_or_none, name_prefix_or_none)[source]

Bases: object

Hold information about one application key, which can be either a master application key or one created with create_key().

__init__(account_id, name, application_key_id, key, capabilities, expiration_timestamp_or_none, bucket_id_or_none, bucket_name_or_none, name_prefix_or_none)[source]
as_key()[source]
as_created_key()[source]

Return the dict returned by b2_create_key.

This is just like the one for b2_list_keys, but also includes the secret key.
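To illustrate the relationship between the two dicts (field names here are assumptions taken from the B2 API's key responses, not read from this module):

```python
# Hedged illustration: the b2_create_key response repeats the
# b2_list_keys entry and adds the secret applicationKey field.
list_keys_entry = {
    'accountId': 'acct-1',
    'applicationKeyId': 'key-id-1',
    'keyName': 'my-key',
    'capabilities': ['listBuckets', 'readFiles'],
}
created_key = dict(list_keys_entry, applicationKey='secret-value')
```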

get_allowed()[source]

Return the ‘allowed’ structure to include in the response from b2_authorize_account.

class b2sdk.raw_simulator.PartSimulator(file_id, part_number, content_length, content_sha1, part_data)[source]

Bases: object

__init__(file_id, part_number, content_length, content_sha1, part_data)[source]
as_list_parts_dict()[source]
class b2sdk.raw_simulator.FileSimulator(account_id, bucket, file_id, action, name, content_type, content_sha1, file_info, data_bytes, upload_timestamp, range_=None, server_side_encryption=None, file_retention=None, legal_hold=LegalHold.UNSET, replication_status=None)[source]

Bases: object

One of three: an unfinished large file, a finished file, or a deletion marker.
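The three states are distinguished by the file's action value. A hedged sketch of the mapping, assuming B2's documented action values ('start', 'upload', 'hide'; 'copy' also denotes a finished file):

```python
def file_state(action):
    # Hypothetical helper, not part of FileSimulator: maps B2's
    # 'action' field to the three states named above.
    return {
        'start': 'unfinished large file',
        'upload': 'finished file',
        'copy': 'finished file',
        'hide': 'deletion marker',
    }.get(action, 'unknown')
```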

CHECK_ENCRYPTION = True
SPECIAL_FILE_INFOS = {'b2-cache-control': 'Cache-Control', 'b2-content-disposition': 'Content-Disposition', 'b2-content-encoding': 'Content-Encoding', 'b2-content-language': 'Content-Language', 'b2-expires': 'Expires'}
__init__(account_id, bucket, file_id, action, name, content_type, content_sha1, file_info, data_bytes, upload_timestamp, range_=None, server_side_encryption=None, file_retention=None, legal_hold=LegalHold.UNSET, replication_status=None)[source]
classmethod dont_check_encryption()[source]
sort_key()[source]

Return a key that can be used to sort the files in a bucket in the order that b2_list_file_versions returns them.
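A hedged stand-alone illustration of that ordering, assuming b2_list_file_versions returns versions sorted by file name ascending and, within a name, newest upload first (the key shape below is illustrative, not FileSimulator's actual implementation):

```python
def sort_key(file_version):
    # Hypothetical stand-in for FileSimulator.sort_key: file name
    # ascending, then upload timestamp descending (newest first).
    return (file_version['fileName'], -file_version['uploadTimestamp'])

versions = [
    {'fileName': 'b.txt', 'uploadTimestamp': 100},
    {'fileName': 'a.txt', 'uploadTimestamp': 100},
    {'fileName': 'a.txt', 'uploadTimestamp': 200},
]
ordered = sorted(versions, key=sort_key)
```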

as_download_headers(account_auth_token_or_none, range_=None)[source]
as_upload_result(account_auth_token)[source]
as_list_files_dict(account_auth_token)[source]
is_allowed_to_read_file_retention(account_auth_token)[source]
as_start_large_file_result(account_auth_token)[source]
add_part(part_number, part)[source]
finish(part_sha1_array)[source]
is_visible()[source]

Does this file show up in b2_list_file_names?

list_parts(start_part_number, max_part_count)[source]
check_encryption(request_encryption)[source]
Parameters:

request_encryption (Optional[EncryptionSetting])

class b2sdk.raw_simulator.FakeRequest(url, headers)

Bases: tuple

headers

Alias for field number 1

url

Alias for field number 0
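FakeRequest is a plain namedtuple, so attribute and positional access are interchangeable; a minimal reconstruction (the sample URL and header values are illustrative):

```python
from collections import namedtuple

# Reconstruction of FakeRequest: field 0 is url, field 1 is headers.
FakeRequest = namedtuple('FakeRequest', ['url', 'headers'])

req = FakeRequest('http://api.example.com/b2api/v2/b2_list_buckets',
                  {'Authorization': 'fake-token'})
```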

class b2sdk.raw_simulator.FakeResponse(account_auth_token_or_none, file_sim, url, range_=None)[source]

Bases: object

__init__(account_auth_token_or_none, file_sim, url, range_=None)[source]
iter_content(chunk_size=1)[source]
property request
close()[source]
class b2sdk.raw_simulator.BucketSimulator(api, account_id, bucket_id, bucket_name, bucket_type, bucket_info=None, cors_rules=None, lifecycle_rules=None, options_set=None, default_server_side_encryption=None, is_file_lock_enabled=None, replication=None)[source]

Bases: object

FIRST_FILE_NUMBER = 9999
FIRST_FILE_ID = '9999'
FILE_SIMULATOR_CLASS

alias of FileSimulator

RESPONSE_CLASS

alias of FakeResponse

MAX_SIMPLE_COPY_SIZE = 200
__init__(api, account_id, bucket_id, bucket_name, bucket_type, bucket_info=None, cors_rules=None, lifecycle_rules=None, options_set=None, default_server_side_encryption=None, is_file_lock_enabled=None, replication=None)[source]
get_file(file_id, file_name)[source]
Return type:

FileSimulator

is_allowed_to_read_bucket_encryption_setting(account_auth_token)[source]
is_allowed_to_read_bucket_retention(account_auth_token)[source]
bucket_dict(account_auth_token)[source]
cancel_large_file(file_id)[source]
delete_file_version(account_auth_token, file_id, file_name, bypass_governance=False)[source]
Parameters:

bypass_governance (bool)

download_file_by_id(account_auth_token_or_none, file_id, url, range_=None, encryption=None)[source]
Parameters:

encryption (Optional[EncryptionSetting])

download_file_by_name(account_auth_token_or_none, file_name, url, range_=None, encryption=None)[source]
Parameters:

encryption (Optional[EncryptionSetting])

finish_large_file(account_auth_token, file_id, part_sha1_array)[source]
get_file_info_by_id(account_auth_token, file_id)[source]
get_file_info_by_name(account_auth_token, file_name)[source]
get_upload_url(account_auth_token)[source]
get_upload_part_url(account_auth_token, file_id)[source]
hide_file(account_auth_token, file_name)[source]
update_file_retention(account_auth_token, file_id, file_name, file_retention, bypass_governance=False)[source]
Parameters:

legal_hold (LegalHold)

copy_file(account_auth_token, file_id, new_file_name, bytes_range=None, metadata_directive=None, content_type=None, file_info=None, destination_bucket_id=None, destination_server_side_encryption=None, source_server_side_encryption=None, file_retention=None, legal_hold=None)[source]
list_file_names(account_auth_token, start_file_name=None, max_file_count=None, prefix=None)[source]
list_file_versions(account_auth_token, start_file_name=None, start_file_id=None, max_file_count=None, prefix=None)[source]
list_parts(file_id, start_part_number, max_part_count)[source]
list_unfinished_large_files(account_auth_token, start_file_id=None, max_file_count=None, prefix=None)[source]
start_large_file(account_auth_token, file_name, content_type, file_info, server_side_encryption=None, file_retention=None, legal_hold=None, custom_upload_timestamp=None)[source]
upload_file(upload_id, upload_auth_token, file_name, content_length, content_type, content_sha1, file_info, data_stream, server_side_encryption=None, file_retention=None, legal_hold=None, custom_upload_timestamp=None)[source]
upload_part(file_id, part_number, content_length, sha1_sum, input_stream, server_side_encryption=None)[source]
Parameters:

server_side_encryption (Optional[EncryptionSetting])

class b2sdk.raw_simulator.RawSimulator(b2_http=None)[source]

Bases: AbstractRawApi

Implement the same interface as B2RawHTTPApi by simulating all of the calls and keeping state in memory.

The intended use for this class is for unit tests that test things built on top of B2RawHTTPApi.
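The pattern this describes — expose the same call shape as the real client but keep all state in plain in-memory structures — can be sketched in miniature (class and field names below are illustrative, not b2sdk's actual ones; the fixed starting id mirrors constants like FIRST_FILE_NUMBER):

```python
class InMemoryBucketApi:
    """Hedged sketch of the simulator pattern: no network, no HTTP,
    just dicts that answer the same questions the real API would."""

    FIRST_BUCKET_NUMBER = 9999  # fixed starting id, as the simulator uses

    def __init__(self):
        self.buckets = {}
        self._next_number = self.FIRST_BUCKET_NUMBER

    def create_bucket(self, account_id, bucket_name, bucket_type):
        bucket_id = str(self._next_number)
        self._next_number += 1
        bucket = {
            'accountId': account_id,
            'bucketId': bucket_id,
            'bucketName': bucket_name,
            'bucketType': bucket_type,
        }
        self.buckets[bucket_id] = bucket
        return bucket

    def list_buckets(self, account_id):
        # Filter by owning account, as the real b2_list_buckets does.
        return [b for b in self.buckets.values()
                if b['accountId'] == account_id]
```

Code under test talks to this object exactly as it would talk to the real client, so unit tests run fast and deterministically with no network access.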

BUCKET_SIMULATOR_CLASS

alias of BucketSimulator

API_URL = 'http://api.example.com'
S3_API_URL = 'http://s3.api.example.com'
DOWNLOAD_URL = 'http://download.example.com'
MIN_PART_SIZE = 200
MAX_PART_ID = 10000
MAX_DURATION_IN_SECONDS = 86400000
UPLOAD_PART_MATCHER = re.compile('https://upload.example.com/part/([^/]*)')
UPLOAD_URL_MATCHER = re.compile('https://upload.example.com/([^/]*)/([^/]*)')
DOWNLOAD_URL_MATCHER = re.compile('http://download.example.com(?:/b2api/v[0-9]+/b2_download_file_by_id\\?fileId=(?P<file_id>[^/]+)|/file/(?P<bucket_name>[^/]+)/(?P<file_name>.+))$')
__init__(b2_http=None)[source]
expire_auth_token(auth_token)[source]

Simulate the auth token expiring.

The next call that tries to use this auth token will get an auth_token_expired error.
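A hedged sketch of that mechanism — a registry that invalidates a token on demand so the next use fails (the real simulator's internals may differ; names here are hypothetical):

```python
class TokenRegistry:
    """Illustrative token-expiry mechanism, not RawSimulator's code."""

    def __init__(self):
        self._valid = set()

    def issue(self, token):
        self._valid.add(token)

    def expire_auth_token(self, token):
        # Mark the token invalid; later calls that present it fail.
        self._valid.discard(token)

    def assert_valid(self, token):
        if token not in self._valid:
            raise RuntimeError('auth_token_expired')
```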

create_account()[source]

Return (accountId, masterApplicationKey) for a newly created account.

set_upload_errors(errors)[source]

Store a sequence of exceptions to raise on upload. They are raised one per upload, in order; once the sequence is exhausted, uploads succeed.
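The queued-error pattern described above can be sketched stand-alone (a hypothetical class for illustration, not the simulator's actual code):

```python
class FlakyUploader:
    """Hedged sketch of set_upload_errors: each stored exception is
    raised once, in order, and uploads succeed after the queue drains.
    Useful for testing retry logic built on top of the client."""

    def __init__(self):
        self._errors = []

    def set_upload_errors(self, errors):
        self._errors = list(errors)

    def upload(self, data):
        if self._errors:
            # Pop from the front so errors fire in the order given.
            raise self._errors.pop(0)
        return {'contentLength': len(data)}
```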

authorize_account(realm_url, application_key_id, application_key)[source]
cancel_large_file(api_url, account_auth_token, file_id)[source]
create_bucket(api_url, account_auth_token, account_id, bucket_name, bucket_type, bucket_info=None, cors_rules=None, lifecycle_rules=None, default_server_side_encryption=None, is_file_lock_enabled=None, replication=None)[source]
create_key(api_url, account_auth_token, account_id, capabilities, key_name, valid_duration_seconds, bucket_id, name_prefix)[source]
delete_file_version(api_url, account_auth_token, file_id, file_name, bypass_governance=False)[source]
Parameters:

bypass_governance (bool)

update_file_retention(api_url, account_auth_token, file_id, file_name, file_retention, bypass_governance=False)[source]
Parameters:

legal_hold (bool)

delete_bucket(api_url, account_auth_token, account_id, bucket_id)[source]
download_file_from_url(account_auth_token_or_none, url, range_=None, encryption=None)[source]
Parameters:

encryption (Optional[EncryptionSetting])

delete_key(api_url, account_auth_token, application_key_id)[source]
finish_large_file(api_url, account_auth_token, file_id, part_sha1_array)[source]
get_download_authorization(api_url, account_auth_token, bucket_id, file_name_prefix, valid_duration_in_seconds)[source]
get_file_info_by_id(api_url, account_auth_token, file_id)[source]
get_file_info_by_name(api_url, account_auth_token, bucket_name, file_name)[source]
get_upload_url(api_url, account_auth_token, bucket_id)[source]
get_upload_part_url(api_url, account_auth_token, file_id)[source]
hide_file(api_url, account_auth_token, bucket_id, file_name)[source]
copy_file(api_url, account_auth_token, source_file_id, new_file_name, bytes_range=None, metadata_directive=None, content_type=None, file_info=None, destination_bucket_id=None, destination_server_side_encryption=None, source_server_side_encryption=None, file_retention=None, legal_hold=None)[source]
copy_part(api_url, account_auth_token, source_file_id, large_file_id, part_number, bytes_range=None, destination_server_side_encryption=None, source_server_side_encryption=None)[source]
list_buckets(api_url, account_auth_token, account_id, bucket_id=None, bucket_name=None)[source]
list_file_names(api_url, account_auth_token, bucket_id, start_file_name=None, max_file_count=None, prefix=None)[source]
list_file_versions(api_url, account_auth_token, bucket_id, start_file_name=None, start_file_id=None, max_file_count=None, prefix=None)[source]
list_keys(api_url, account_auth_token, account_id, max_key_count=1000, start_application_key_id=None)[source]
list_parts(api_url, account_auth_token, file_id, start_part_number, max_part_count)[source]
list_unfinished_large_files(api_url, account_auth_token, bucket_id, start_file_id=None, max_file_count=None, prefix=None)[source]
start_large_file(api_url, account_auth_token, bucket_id, file_name, content_type, file_info, server_side_encryption=None, file_retention=None, legal_hold=None, custom_upload_timestamp=None)[source]
update_bucket(api_url, account_auth_token, account_id, bucket_id, bucket_type=None, bucket_info=None, cors_rules=None, lifecycle_rules=None, if_revision_is=None, default_server_side_encryption=None, default_retention=None, replication=None, is_file_lock_enabled=None)[source]
classmethod get_upload_file_headers(upload_auth_token, file_name, content_length, content_type, content_sha1, file_info, server_side_encryption, file_retention, legal_hold, custom_upload_timestamp=None)[source]
Return type:

dict

upload_file(upload_url, upload_auth_token, file_name, content_length, content_type, content_sha1, file_info, data_stream, server_side_encryption=None, file_retention=None, legal_hold=None, custom_upload_timestamp=None)[source]
upload_part(upload_url, upload_auth_token, part_number, content_length, sha1_sum, input_stream, server_side_encryption=None)[source]
Parameters:

server_side_encryption (Optional[EncryptionSetting])