Dataset schema (one row per function; column statistics as reported by the viewer):

| column | type |
|---|---|
| repo | string (7–55 chars) |
| path | string (4–127 chars) |
| func_name | string (1–88 chars) |
| original_string | string (75–19.8k chars) |
| language | class (1 value) |
| code | string (75–19.8k chars) |
| code_tokens | list |
| docstring | string (3–17.3k chars) |
| docstring_tokens | list |
| sha | string (40 chars) |
| url | string (87–242 chars) |
| partition | class (1 value) |

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py · **func_name:** `TableService.get_table_service_stats` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py#L335-L369

**original_string / code** (identical columns; truncated in source):

```python
def get_table_service_stats(self, timeout=None):
    '''
    Retrieves statistics related to replication for the Table service. It is
    only available when read-access geo-redundant replication is enabled for
    the storage account.
    With geo-redundant replication, Azure Storage maintains y...
```

**code_tokens** (truncated): `"def", "get_table_service_stats", "(", "self", ",", "timeout", "=", "None", ")", ":", "request", "=", "HTTPRequest", "(", ")", "request", ".", "method", "=", "'GET'", "request", ".", "host_locations", "=", "self", ".", "_get_host_locations", "(", "primary", ...`

**docstring** (truncated): Retrieves statistics related to replication for the Table service. It is only available when read-access geo-redundant replication is enabled for the storage account. With geo-redundant replication, Azure Storage maintains your data durable in two locations. In both locations, Azure ...

**docstring_tokens:** `"Retrieves", "statistics", "related", "to", "replication", "for", "the", "Table", "service", ".", "It", "is", "only", "available", "when", "read", "-", "access", "geo", "-", "redundant", "replication", "is", "enabled", "for", "the", "storage", "account", "."`

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py · **func_name:** `TableService.get_table_service_properties` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py#L371-L391

**original_string / code** (identical columns; truncated in source):

```python
def get_table_service_properties(self, timeout=None):
    '''
    Gets the properties of a storage account's Table service, including
    logging, analytics and CORS rules.
    :param int timeout:
        The server timeout, expressed in seconds.
    :return: The table service properties.
    ...
```

**code_tokens** (truncated): `"def", "get_table_service_properties", "(", "self", ",", "timeout", "=", "None", ")", ":", "request", "=", "HTTPRequest", "(", ")", "request", ".", "method", "=", "'GET'", "request", ".", "host_locations", "=", "self", ".", "_get_host_locations", "(", "seconda...`

**docstring:** Gets the properties of a storage account's Table service, including logging, analytics and CORS rules. :param int timeout: The server timeout, expressed in seconds. :return: The table service properties. :rtype: :class:`~azure.storage.common.models.ServiceProperties`

**docstring_tokens:** `"Gets", "the", "properties", "of", "a", "storage", "account", "s", "Table", "service", "including", "logging", "analytics", "and", "CORS", "rules", "."`

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py · **func_name:** `TableService.delete_table` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py#L571-L611

**original_string / code** (identical columns; truncated in source):

```python
def delete_table(self, table_name, fail_not_exist=False, timeout=None):
    '''
    Deletes the specified table and any data it contains.
    When a table is successfully deleted, it is immediately marked for deletion
    and is no longer accessible to clients. The table is later removed from
    ...
```

**code_tokens** (truncated): `"def", "delete_table", "(", "self", ",", "table_name", ",", "fail_not_exist", "=", "False", ",", "timeout", "=", "None", ")", ":", "_validate_not_none", "(", "'table_name'", ",", "table_name", ")", "request", "=", "HTTPRequest", "(", ")", "request", ".", "me...`

**docstring** (truncated): Deletes the specified table and any data it contains. When a table is successfully deleted, it is immediately marked for deletion and is no longer accessible to clients. The table is later removed from the Table service during garbage collection. Note that deleting a table is likely ...

**docstring_tokens:** `"Deletes", "the", "specified", "table", "and", "any", "data", "it", "contains", "."`

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py · **func_name:** `TableService.query_entities` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py#L678-L740

**original_string / code** (identical columns; truncated in source):

```python
def query_entities(self, table_name, filter=None, select=None, num_results=None,
                   marker=None, accept=TablePayloadFormat.JSON_MINIMAL_METADATA,
                   property_resolver=None, timeout=None):
    '''
    Returns a generator to list the entities in the table specified. The
    ...
```

**code_tokens** (truncated): `"def", "query_entities", "(", "self", ",", "table_name", ",", "filter", "=", "None", ",", "select", "=", "None", ",", "num_results", "=", "None", ",", "marker", "=", "None", ",", "accept", "=", "TablePayloadFormat", ".", "JSON_MINIMAL_METADATA", ",", "proper...`

**docstring** (truncated): Returns a generator to list the entities in the table specified. The generator will lazily follow the continuation tokens returned by the service and stop when all entities have been returned or num_results is reached. If num_results is specified and the account has more than that num...

**docstring_tokens** (truncated): `"Returns", "a", "generator", "to", "list", "the", "entities", "in", "the", "table", "specified", ".", "The", "generator", "will", "lazily", "follow", "the", "continuation", "tokens", "returned", "by", "the", "service", "and", "stop", "when", "all", "entities", ...`

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py · **func_name:** `TableService.merge_entity` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/azure/cosmosdb/table/tableservice.py#L969-L1008

**original_string / code** (identical columns; truncated in source):

```python
def merge_entity(self, table_name, entity, if_match='*', timeout=None):
    '''
    Updates an existing entity by merging the entity's properties. Throws
    if the entity does not exist.
    This operation does not replace the existing entity as the update_entity
    operation does. A pr...
```

**code_tokens** (truncated): `"def", "merge_entity", "(", "self", ",", "table_name", ",", "entity", ",", "if_match", "=", "'*'", ",", "timeout", "=", "None", ")", ":", "_validate_not_none", "(", "'table_name'", ",", "table_name", ")", "request", "=", "_merge_entity", "(", "entity", ",", ...`

**docstring** (truncated): Updates an existing entity by merging the entity's properties. Throws if the entity does not exist. This operation does not replace the existing entity as the update_entity operation does. A property cannot be removed with merge_entity. Any properties with null values...

**docstring_tokens** (truncated): `"Updates", "an", "existing", "entity", "by", "merging", "the", "entity", "s", "properties", ".", "Throws", "if", "the", "entity", "does", "not", "exist", ".", "This", "operation", "does", "not", "replace", "the", "existing", "entity", "as", "the", "update_en...`

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/samples/table/table_usage.py · **func_name:** `TableSamples.create_entity_class` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/samples/table/table_usage.py#L203-L225

**original_string / code** (identical columns; truncated in source):

```python
def create_entity_class(self):
    '''
    Creates a class-based entity with fixed values, using all of the supported data types.
    '''
    entity = Entity()
    # Partition key and row key must be strings and are required
    entity.PartitionKey = 'pk{}'.format(str(uuid.uuid4()).replace('-',...
```

**code_tokens** (truncated): `"def", "create_entity_class", "(", "self", ")", ":", "entity", "=", "Entity", "(", ")", "# Partition key and row key must be strings and are required", "entity", ".", "PartitionKey", "=", "'pk{}'", ".", "format", "(", "str", "(", "uuid", ".", "uuid4", "(", ")", ...`

**docstring:** Creates a class-based entity with fixed values, using all of the supported data types.

**docstring_tokens:** `"Creates", "a", "class", "-", "based", "entity", "with", "fixed", "values", "using", "all", "of", "the", "supported", "data", "types", "."`

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/samples/table/table_usage.py · **func_name:** `TableSamples.create_entity_dict` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/samples/table/table_usage.py#L227-L249

**original_string / code** (identical columns; truncated in source):

```python
def create_entity_dict(self):
    '''
    Creates a dict-based entity with fixed values, using all of the supported data types.
    '''
    entity = {}
    # Partition key and row key must be strings and are required
    entity['PartitionKey'] = 'pk{}'.format(str(uuid.uuid4()).replace('-', ''))...
```

**code_tokens** (truncated): `"def", "create_entity_dict", "(", "self", ")", ":", "entity", "=", "{", "}", "# Partition key and row key must be strings and are required", "entity", "[", "'PartitionKey'", "]", "=", "'pk{}'", ".", "format", "(", "str", "(", "uuid", ".", "uuid4", "(", ")", ")", ...`

**docstring:** Creates a dict-based entity with fixed values, using all of the supported data types.

**docstring_tokens:** `"Creates", "a", "dict", "-", "based", "entity", "with", "fixed", "values", "using", "all", "of", "the", "supported", "data", "types", "."`

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/azure/cosmosdb/table/_serialization.py · **func_name:** `_convert_batch_to_json` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/azure/cosmosdb/table/_serialization.py#L220-L266

**original_string / code** (identical columns; truncated in source):

```python
def _convert_batch_to_json(batch_requests):
    '''
    Create json to send for an array of batch requests.
    batch_requests:
        an array of requests
    '''
    batch_boundary = b'batch_' + _new_boundary()
    changeset_boundary = b'changeset_' + _new_boundary()
    body = [b'--' + batch_boundary + b'\n',
        ...
```

**code_tokens** (truncated): `"def", "_convert_batch_to_json", "(", "batch_requests", ")", ":", "batch_boundary", "=", "b'batch_'", "+", "_new_boundary", "(", ")", "changeset_boundary", "=", "b'changeset_'", "+", "_new_boundary", "(", ")", "body", "=", "[", "b'--'", "+", "batch_boundary", "+", ...`

**docstring:** Create json to send for an array of batch requests. batch_requests: an array of requests

**docstring_tokens:** `"Create", "json", "to", "send", "for", "an", "array", "of", "batch", "requests", "."`
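The visible prefix of `_convert_batch_to_json` builds an outer `batch_` boundary and an inner `changeset_` boundary before the body is assembled. A stdlib-only sketch of that nested MIME-boundary layout (the `_new_boundary` helper and the exact header lines are assumptions; the real function's remainder is truncated in the source):

```python
import uuid

def _new_boundary():
    # Stand-in for the library's boundary helper: a random hex token as bytes.
    return uuid.uuid4().hex.encode('utf-8')

def build_batch_body(payloads):
    """Wrap per-request payloads in batch/changeset boundaries, mirroring
    the structure the truncated function starts to build."""
    batch_boundary = b'batch_' + _new_boundary()
    changeset_boundary = b'changeset_' + _new_boundary()
    body = [b'--' + batch_boundary + b'\n',
            b'Content-Type: multipart/mixed; boundary=' + changeset_boundary + b'\n\n']
    for payload in payloads:
        body.append(b'--' + changeset_boundary + b'\n')
        body.append(payload + b'\n')
    # Close the inner changeset first, then the outer batch.
    body.append(b'--' + changeset_boundary + b'--\n')
    body.append(b'--' + batch_boundary + b'--\n')
    return b''.join(body)

body = build_batch_body([b'{"RowKey": "1"}'])
```

The nesting matters: every changeset part must sit inside the single batch part, so the closing markers appear in inner-to-outer order.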

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/azure/cosmosdb/table/_encryption.py · **func_name:** `_decrypt_entity` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/azure/cosmosdb/table/_encryption.py#L163-L212

**original_string / code** (identical columns; truncated in source):

```python
def _decrypt_entity(entity, encrypted_properties_list, content_encryption_key, entityIV, isJavaV1):
    '''
    Decrypts the specified entity using AES256 in CBC mode with 128 bit padding. Unwraps the CEK
    using either the specified KEK or the key returned by the key_resolver. Properties
    specified in the encry...
```

**code_tokens** (truncated): `"def", "_decrypt_entity", "(", "entity", ",", "encrypted_properties_list", ",", "content_encryption_key", ",", "entityIV", ",", "isJavaV1", ")", ":", "_validate_not_none", "(", "'entity'", ",", "entity", ")", "decrypted_entity", "=", "deepcopy", "(", "entity", ")", ...`

**docstring** (truncated): Decrypts the specified entity using AES256 in CBC mode with 128 bit padding. Unwraps the CEK using either the specified KEK or the key returned by the key_resolver. Properties specified in the encrypted_properties_list, will be decrypted and decoded to utf-8 strings. :param entity: The entity bei...

**docstring_tokens** (truncated): `"Decrypts", "the", "specified", "entity", "using", "AES256", "in", "CBC", "mode", "with", "128", "bit", "padding", ".", "Unwraps", "the", "CEK", "using", "either", "the", "specified", "KEK", "or", "the", "key", "returned", "by", "the", "key_resolver", ".", ...`

---

**repo:** Azure/azure-cosmos-table-python · **path:** azure-cosmosdb-table/azure/cosmosdb/table/_encryption.py · **func_name:** `_generate_property_iv` · **language:** python · **partition:** train
**sha:** a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0 · **url:** https://github.com/Azure/azure-cosmos-table-python/blob/a7b618f6bddc465c9fdf899ea2971dfe4d04fcf0/azure-cosmosdb-table/azure/cosmosdb/table/_encryption.py#L287-L300

**original_string / code** (identical columns; truncated in source):

```python
def _generate_property_iv(entity_iv, pk, rk, property_name, isJavaV1):
    '''
    Uses the entity_iv, partition key, and row key to generate and return
    the iv for the specified property.
    '''
    digest = Hash(SHA256(), default_backend())
    if not isJavaV1:
        digest.update(entity_iv +
            ...
```

**code_tokens** (truncated): `"def", "_generate_property_iv", "(", "entity_iv", ",", "pk", ",", "rk", ",", "property_name", ",", "isJavaV1", ")", ":", "digest", "=", "Hash", "(", "SHA256", "(", ")", ",", "default_backend", "(", ")", ")", "if", "not", "isJavaV1", ":", "digest", ".", ...`

**docstring:** Uses the entity_iv, partition key, and row key to generate and return the iv for the specified property.

**docstring_tokens:** `"Uses", "the", "entity_iv", "partition", "key", "and", "row", "key", "to", "generate", "and", "return", "the", "iv", "for", "the", "specified", "property", "."`
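The snippet derives a per-property IV by hashing the entity IV with the keys and property name (it uses `cryptography`'s `Hash`/`SHA256`; a stdlib `hashlib` sketch is equivalent for SHA-256). The exact concatenation order is cut off in the source, so the layout below is one plausible assumption, not the library's wire format:

```python
import hashlib

def generate_property_iv(entity_iv, pk, rk, property_name):
    """Derive a deterministic 16-byte IV for one property by hashing the
    entity IV together with partition key, row key, and property name.
    Concatenation order is an assumption; the original is truncated."""
    digest = hashlib.sha256()
    digest.update(entity_iv + (pk + rk + property_name).encode('utf-8'))
    # AES-CBC needs a 16-byte IV, so truncate the 32-byte SHA-256 digest.
    return digest.digest()[:16]

iv = generate_property_iv(b'\x00' * 16, 'pk1', 'rk1', 'Prop')
```

Because the derivation is deterministic, decryption can regenerate the same IV from the stored entity IV without persisting one IV per property.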

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/base.py · **func_name:** `JSONSchema._get_default_mapping` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/base.py#L96-L107

**original_string / code** (identical columns; truncated in source):

```python
def _get_default_mapping(self, obj):
    """Return default mapping if there are no special needs."""
    mapping = {v: k for k, v in obj.TYPE_MAPPING.items()}
    mapping.update({
        fields.Email: text_type,
        fields.Dict: dict,
        fields.Url: text_type,
        fields.List: ...
```

**code_tokens** (truncated): `"def", "_get_default_mapping", "(", "self", ",", "obj", ")", ":", "mapping", "=", "{", "v", ":", "k", "for", "k", ",", "v", "in", "obj", ".", "TYPE_MAPPING", ".", "items", "(", ")", "}", "mapping", ".", "update", "(", "{", "fields", ".", "Email", ...`

**docstring:** Return default mapping if there are no special needs.

**docstring_tokens:** `"Return", "default", "mapping", "if", "there", "are", "no", "special", "needs", "."`
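The visible body inverts the schema's `TYPE_MAPPING` (python type → field class) into field class → python type, then patches in field types that marshmallow maps specially. A self-contained sketch with hypothetical stand-in classes (the real `fields.*` classes and `TYPE_MAPPING` live in marshmallow and are not reproduced here):

```python
# Hypothetical stand-ins for marshmallow field classes.
class String: pass
class Integer: pass
class Email(String): pass  # special-cased: serializes as a string

# Stand-in for a schema's TYPE_MAPPING: python type -> field class.
TYPE_MAPPING = {str: String, int: Integer}

def get_default_mapping():
    """Invert TYPE_MAPPING to field class -> python type, then add the
    special cases the snippet handles (Email -> text, etc.)."""
    mapping = {v: k for k, v in TYPE_MAPPING.items()}
    mapping.update({Email: str})
    return mapping

mapping = get_default_mapping()
```

The inversion works because the mapping is one-to-one per field class; the explicit `update` handles subclasses like `Email` that would otherwise have no entry of their own.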

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/base.py · **func_name:** `JSONSchema.get_properties` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/base.py#L109-L117

**original_string / code** (identical columns):

```python
def get_properties(self, obj):
    """Fill out properties field."""
    properties = {}
    for field_name, field in sorted(obj.fields.items()):
        schema = self._get_schema_for_field(obj, field)
        properties[field.name] = schema
    return properties
```

**code_tokens** (truncated): `"def", "get_properties", "(", "self", ",", "obj", ")", ":", "properties", "=", "{", "}", "for", "field_name", ",", "field", "in", "sorted", "(", "obj", ".", "fields", ".", "items", "(", ")", ")", ":", "schema", "=", "self", ".", "_get_schema_for_field", ...`

**docstring:** Fill out properties field.

**docstring_tokens:** `"Fill", "out", "properties", "field", "."`

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/base.py · **func_name:** `JSONSchema.get_required` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/base.py#L119-L127

**original_string / code** (identical columns):

```python
def get_required(self, obj):
    """Fill out required field."""
    required = []
    for field_name, field in sorted(obj.fields.items()):
        if field.required:
            required.append(field.name)
    return required or missing
```

**code_tokens** (truncated): `"def", "get_required", "(", "self", ",", "obj", ")", ":", "required", "=", "[", "]", "for", "field_name", ",", "field", "in", "sorted", "(", "obj", ".", "fields", ".", "items", "(", ")", ")", ":", "if", "field", ".", "required", ":", "required", ".", ...`

**docstring:** Fill out required field.

**docstring_tokens:** `"Fill", "out", "required", "field", "."`
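`get_required` collects the names of required fields in sorted order and falls back to marshmallow's `missing` sentinel (a falsy marker meaning "omit this key") when no field is required. A stdlib-only sketch of the same logic, with `None` standing in for the sentinel:

```python
from collections import namedtuple

# Minimal stand-in for a marshmallow field: just a name and a required flag.
Field = namedtuple('Field', ['name', 'required'])

def get_required(fields_by_name):
    """Return required field names in sorted field order, or a falsy
    placeholder (None here; `missing` in the real code) when empty."""
    required = []
    for field_name, field in sorted(fields_by_name.items()):
        if field.required:
            required.append(field.name)
    return required or None

req = get_required({'b': Field('b', True), 'a': Field('a', False)})
```

Returning a sentinel instead of `[]` lets the serializer drop the `required` key entirely, since JSON Schema forbids an empty `required` array.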

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/base.py · **func_name:** `JSONSchema._from_python_type` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/base.py#L129-L157

**original_string / code** (identical columns; truncated in source):

```python
def _from_python_type(self, obj, field, pytype):
    """Get schema definition from python type."""
    json_schema = {
        'title': field.attribute or field.name,
    }
    for key, val in TYPE_MAP[pytype].items():
        json_schema[key] = val
    if field.dump_only:
        json...
```

**code_tokens** (truncated): `"def", "_from_python_type", "(", "self", ",", "obj", ",", "field", ",", "pytype", ")", ":", "json_schema", "=", "{", "'title'", ":", "field", ".", "attribute", "or", "field", ".", "name", ",", "}", "for", "key", ",", "val", "in", "TYPE_MAP", "[", "py...`

**docstring:** Get schema definition from python type.

**docstring_tokens:** `"Get", "schema", "definition", "from", "python", "type", "."`

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/base.py · **func_name:** `JSONSchema._get_schema_for_field` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/base.py#L159-L183

**original_string / code** (identical columns; truncated in source):

```python
def _get_schema_for_field(self, obj, field):
    """Get schema and validators for field."""
    mapping = self._get_default_mapping(obj)
    if hasattr(field, '_jsonschema_type_mapping'):
        schema = field._jsonschema_type_mapping()
    elif '_jsonschema_type_mapping' in field.metadata:
        ...
```

**code_tokens** (truncated): `"def", "_get_schema_for_field", "(", "self", ",", "obj", ",", "field", ")", ":", "mapping", "=", "self", ".", "_get_default_mapping", "(", "obj", ")", "if", "hasattr", "(", "field", ",", "'_jsonschema_type_mapping'", ")", ":", "schema", "=", "field", ".", ...`

**docstring:** Get schema and validators for field.

**docstring_tokens:** `"Get", "schema", "and", "validators", "for", "field", "."`

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/base.py · **func_name:** `JSONSchema._from_nested_schema` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/base.py#L185-L236

**original_string / code** (identical columns; truncated in source):

```python
def _from_nested_schema(self, obj, field):
    """Support nested field."""
    if isinstance(field.nested, basestring):
        nested = get_class(field.nested)
    else:
        nested = field.nested
    name = nested.__name__
    outer_name = obj.__class__.__name__
    only = field.on...
```

**code_tokens** (truncated): `"def", "_from_nested_schema", "(", "self", ",", "obj", ",", "field", ")", ":", "if", "isinstance", "(", "field", ".", "nested", ",", "basestring", ")", ":", "nested", "=", "get_class", "(", "field", ".", "nested", ")", "else", ":", "nested", "=", "fiel...`

**docstring:** Support nested field.

**docstring_tokens:** `"Support", "nested", "field", "."`

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/base.py · **func_name:** `JSONSchema.wrap` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/base.py#L244-L255

**original_string / code** (identical columns; truncated in source):

```python
def wrap(self, data):
    """Wrap this with the root schema definitions."""
    if self.nested:  # no need to wrap, will be in outer defs
        return data
    name = self.obj.__class__.__name__
    self._nested_schema_classes[name] = data
    root = {
        'definitions': self._nested_...
```

**code_tokens** (truncated): `"def", "wrap", "(", "self", ",", "data", ")", ":", "if", "self", ".", "nested", ":", "# no need to wrap, will be in outer defs", "return", "data", "name", "=", "self", ".", "obj", ".", "__class__", ".", "__name__", "self", ".", "_nested_schema_classes", "[", ...`

**docstring:** Wrap this with the root schema definitions.

**docstring_tokens:** `"Wrap", "this", "with", "the", "root", "schema", "definitions", "."`

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/validation.py · **func_name:** `handle_length` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/validation.py#L4-L47

**original_string / code** (identical columns; truncated in source):

```python
def handle_length(schema, field, validator, parent_schema):
    """Adds validation logic for ``marshmallow.validate.Length``, setting the
    values appropriately for ``fields.List``, ``fields.Nested``, and
    ``fields.String``.
    Args:
        schema (dict): The original JSON schema we generated. This is what we
            ...
```

**code_tokens** (truncated): `"def", "handle_length", "(", "schema", ",", "field", ",", "validator", ",", "parent_schema", ")", ":", "if", "isinstance", "(", "field", ",", "fields", ".", "String", ")", ":", "minKey", "=", "'minLength'", "maxKey", "=", "'maxLength'", "elif", "isinstance", ...`

**docstring** (truncated): Adds validation logic for ``marshmallow.validate.Length``, setting the values appropriately for ``fields.List``, ``fields.Nested``, and ``fields.String``. Args: schema (dict): The original JSON schema we generated. This is what we want to post-process. field (fields.Field): The ...

**docstring_tokens:** `"Adds", "validation", "logic", "for", "marshmallow", ".", "validate", ".", "Length", "setting", "the", "values", "appropriately", "for", "fields", ".", "List", "fields", ".", "Nested", "and", "fields", ".", "String", "."`
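The visible branch logic maps a `Length` validator onto different JSON Schema keywords depending on the field type: string lengths become `minLength`/`maxLength`, while list lengths become `minItems`/`maxItems`. A stdlib-only sketch of that dispatch (field types are passed as plain strings here instead of marshmallow classes):

```python
def apply_length(schema, field_kind, min_=None, max_=None):
    """Map a Length-style validator onto JSON Schema keywords, choosing the
    keyword pair by field kind as the truncated snippet does."""
    if field_kind == 'string':
        min_key, max_key = 'minLength', 'maxLength'
    elif field_kind in ('list', 'nested'):
        min_key, max_key = 'minItems', 'maxItems'
    else:
        raise ValueError('Length validator is not supported for this field kind')
    if min_ is not None:
        schema[min_key] = min_
    if max_ is not None:
        schema[max_key] = max_
    return schema

string_schema = apply_length({'type': 'string'}, 'string', min_=1, max_=10)
list_schema = apply_length({'type': 'array'}, 'list', min_=2)
```

Choosing the keyword pair by field kind keeps the emitted schema valid: `minLength` on an array (or `minItems` on a string) would simply be ignored by a JSON Schema validator.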

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/validation.py · **func_name:** `handle_one_of` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/validation.py#L50-L72

**original_string / code** (identical columns; truncated in source):

```python
def handle_one_of(schema, field, validator, parent_schema):
    """Adds the validation logic for ``marshmallow.validate.OneOf`` by setting
    the JSONSchema `enum` property to the allowed choices in the validator.
    Args:
        schema (dict): The original JSON schema we generated. This is what we
            want...
```

**code_tokens** (truncated): `"def", "handle_one_of", "(", "schema", ",", "field", ",", "validator", ",", "parent_schema", ")", ":", "if", "validator", ".", "choices", ":", "schema", "[", "'enum'", "]", "=", "list", "(", "validator", ".", "choices", ")", "schema", "[", "'enumNames'", ...`

**docstring** (truncated): Adds the validation logic for ``marshmallow.validate.OneOf`` by setting the JSONSchema `enum` property to the allowed choices in the validator. Args: schema (dict): The original JSON schema we generated. This is what we want to post-process. field (fields.Field): The field that gene...

**docstring_tokens:** `"Adds", "the", "validation", "logic", "for", "marshmallow", ".", "validate", ".", "OneOf", "by", "setting", "the", "JSONSchema", "enum", "property", "to", "the", "allowed", "choices", "in", "the", "validator", "."`
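The visible tokens show the `OneOf` choices being copied into the schema's `enum` and a non-standard `enumNames` key being set (what `enumNames` is filled with is cut off; using the validator's labels, falling back to the choices, is an assumption). A self-contained sketch:

```python
def apply_one_of(schema, choices, labels=None):
    """Copy a OneOf validator's choices into JSON Schema `enum`; also emit
    `enumNames` (non-standard, used by some form generators). Filling it
    from labels-or-choices is an assumption; the original is truncated."""
    if choices:
        schema['enum'] = list(choices)
        schema['enumNames'] = list(labels) if labels else list(choices)
    return schema

color = apply_one_of({'type': 'string'}, ('red', 'green'))
labeled = apply_one_of({'type': 'string'}, ('r', 'g'), labels=('Red', 'Green'))
```

An empty choices tuple leaves the schema untouched, matching the `if validator.choices:` guard in the snippet.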

---

**repo:** fuhrysteve/marshmallow-jsonschema · **path:** marshmallow_jsonschema/validation.py · **func_name:** `handle_range` · **language:** python · **partition:** train
**sha:** 3e0891a79d586c49deb75188d9ee1728597d093b · **url:** https://github.com/fuhrysteve/marshmallow-jsonschema/blob/3e0891a79d586c49deb75188d9ee1728597d093b/marshmallow_jsonschema/validation.py#L75-L107

**original_string / code** (identical columns; truncated in source):

```python
def handle_range(schema, field, validator, parent_schema):
    """Adds validation logic for ``marshmallow.validate.Range``, setting the
    values appropriately ``fields.Number`` and it's subclasses.
    Args:
        schema (dict): The original JSON schema we generated. This is what we
            want to post-proces...
```

**code_tokens** (truncated): `"def", "handle_range", "(", "schema", ",", "field", ",", "validator", ",", "parent_schema", ")", ":", "if", "not", "isinstance", "(", "field", ",", "fields", ".", "Number", ")", ":", "return", "schema", "if", "validator", ".", "min", ":", "schema", "[", ...`

**docstring** (truncated): Adds validation logic for ``marshmallow.validate.Range``, setting the values appropriately ``fields.Number`` and it's subclasses. Args: schema (dict): The original JSON schema we generated. This is what we want to post-process. field (fields.Field): The field that generated the orig...

**docstring_tokens:** `"Adds", "validation", "logic", "for", "marshmallow", ".", "validate", ".", "Range", "setting", "the", "values", "appropriately", "fields", ".", "Number", "and", "it", "s", "subclasses", "."`
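Per the visible logic, only `Number` fields get range keywords, and the bounds are gated on truthiness (`if validator.min:`), so a bound of `0` is skipped. A sketch of that mapping to JSON Schema's `minimum`/`maximum` (the target keyword names are an assumption based on the truncated body):

```python
def apply_range(schema, is_number_field, min_=None, max_=None):
    """Map a Range validator onto JSON Schema bounds. Non-number fields pass
    through unchanged; truthiness gating mirrors the snippet (so 0 is
    dropped, a quirk of the original `if validator.min:` check)."""
    if not is_number_field:
        return schema
    if min_:
        schema['minimum'] = min_
    if max_:
        schema['maximum'] = max_
    return schema

bounded = apply_range({'type': 'number'}, True, min_=1, max_=5)
untouched = apply_range({'type': 'string'}, False, min_=1)
```

The truthiness check is worth noting when reading the original: `Range(min=0)` would produce no `minimum` keyword at all.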

---

**repo:** mmp2/megaman · **path:** megaman/utils/eigendecomp.py · **func_name:** `check_eigen_solver` · **language:** python · **partition:** train
**sha:** faccaf267aad0a8b18ec8a705735fd9dd838ca1e · **url:** https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/utils/eigendecomp.py#L28-L72

**original_string / code** (identical columns; truncated in source):

```python
def check_eigen_solver(eigen_solver, solver_kwds, size=None, nvec=None):
    """Check that the selected eigensolver is valid
    Parameters
    ----------
    eigen_solver : string
        string value to validate
    size, nvec : int (optional)
        if both provided, use the specified problem size and number of ve...
```

**code_tokens** (truncated): `"def", "check_eigen_solver", "(", "eigen_solver", ",", "solver_kwds", ",", "size", "=", "None", ",", "nvec", "=", "None", ")", ":", "if", "eigen_solver", "in", "BAD_EIGEN_SOLVERS", ":", "raise", "ValueError", "(", "BAD_EIGEN_SOLVERS", "[", "eigen_solver", "]", ...`

**docstring** (truncated): Check that the selected eigensolver is valid Parameters ---------- eigen_solver : string string value to validate size, nvec : int (optional) if both provided, use the specified problem size and number of vectors to determine the optimal method to use with eigen_solver='auto' ...

**docstring_tokens:** `"Check", "that", "the", "selected", "eigensolver", "is", "valid"`

---

**repo:** mmp2/megaman · **path:** megaman/relaxation/precomputed.py · **func_name:** `precompute_optimzation_Y` · **language:** python · **partition:** train
**sha:** faccaf267aad0a8b18ec8a705735fd9dd838ca1e · **url:** https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/precomputed.py#L8-L19

**original_string / code** (identical columns; truncated in source):

```python
def precompute_optimzation_Y(laplacian_matrix, n_samples, relaxation_kwds):
    """compute Lk, neighbors and subset to index map for projected == False"""
    relaxation_kwds.setdefault('presave',False)
    relaxation_kwds.setdefault('presave_name','pre_comp_current.npy')
    relaxation_kwds.setdefault('verbose',False)...
```

**code_tokens** (truncated): `"def", "precompute_optimzation_Y", "(", "laplacian_matrix", ",", "n_samples", ",", "relaxation_kwds", ")", ":", "relaxation_kwds", ".", "setdefault", "(", "'presave'", ",", "False", ")", "relaxation_kwds", ".", "setdefault", "(", "'presave_name'", ",", "'pre_comp_curr...`

**docstring:** compute Lk, neighbors and subset to index map for projected == False

**docstring_tokens:** `"compute", "Lk", "neighbors", "and", "subset", "to", "index", "map", "for", "projected", "==", "False"`
mmp2/megaman | megaman/relaxation/precomputed.py | compute_Lk | def compute_Lk(laplacian_matrix,n_samples,subset):
"""
Compute sparse L matrix, neighbors and subset to L matrix index map.
Returns
-------
Lk_tensor : array-like. Length = n
each component correspond to the sparse matrix of Lk, which is
generated by extracting the kth row of laplac... | python | def compute_Lk(laplacian_matrix,n_samples,subset):
"""
Compute sparse L matrix, neighbors and subset to L matrix index map.
Returns
-------
Lk_tensor : array-like. Length = n
each component corresponds to the sparse matrix of Lk, which is
generated by extracting the kth row of laplac... | [
"def",
"compute_Lk",
"(",
"laplacian_matrix",
",",
"n_samples",
",",
"subset",
")",
":",
"Lk_tensor",
"=",
"[",
"]",
"nbk",
"=",
"[",
"]",
"row",
",",
"column",
"=",
"laplacian_matrix",
".",
"T",
".",
"nonzero",
"(",
")",
"nnz_val",
"=",
"np",
".",
"... | Compute sparse L matrix, neighbors and subset to L matrix index map.
Returns
-------
Lk_tensor : array-like. Length = n
each component corresponds to the sparse matrix of Lk, which is
generated by extracting the kth row of laplacian and removing zeros.
nbk : array-like. Length = n
... | [
"Compute",
"sparse",
"L",
"matrix",
"neighbors",
"and",
"subset",
"to",
"L",
"matrix",
"index",
"map",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/precomputed.py#L21-L71 | train |
mmp2/megaman | megaman/relaxation/precomputed.py | precompute_optimzation_S | def precompute_optimzation_S(laplacian_matrix,n_samples,relaxation_kwds):
"""compute Rk, A, ATAinv, neighbors and pairs for projected mode"""
relaxation_kwds.setdefault('presave',False)
relaxation_kwds.setdefault('presave_name','pre_comp_current.npy')
relaxation_kwds.setdefault('verbose',False)
if r... | python | def precompute_optimzation_S(laplacian_matrix,n_samples,relaxation_kwds):
"""compute Rk, A, ATAinv, neighbors and pairs for projected mode"""
relaxation_kwds.setdefault('presave',False)
relaxation_kwds.setdefault('presave_name','pre_comp_current.npy')
relaxation_kwds.setdefault('verbose',False)
if r... | [
"def",
"precompute_optimzation_S",
"(",
"laplacian_matrix",
",",
"n_samples",
",",
"relaxation_kwds",
")",
":",
"relaxation_kwds",
".",
"setdefault",
"(",
"'presave'",
",",
"False",
")",
"relaxation_kwds",
".",
"setdefault",
"(",
"'presave_name'",
",",
"'pre_comp_curr... | compute Rk, A, ATAinv, neighbors and pairs for projected mode | [
"compute",
"Rk",
"A",
"ATAinv",
"neighbors",
"and",
"pairs",
"for",
"projected",
"mode"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/precomputed.py#L73-L92 | train |
mmp2/megaman | megaman/relaxation/precomputed.py | compute_Rk | def compute_Rk(L,A,n_samples):
# TODO: need to inspect more into compute Rk.
"""
Compute sparse L matrix and neighbors.
Returns
-------
Rk_tensor : array-like. Length = n
each component corresponds to the sparse matrix of Lk, which is
generated by extracting the kth row of laplac... | python | def compute_Rk(L,A,n_samples):
# TODO: need to inspect more into compute Rk.
"""
Compute sparse L matrix and neighbors.
Returns
-------
Rk_tensor : array-like. Length = n
each component corresponds to the sparse matrix of Lk, which is
generated by extracting the kth row of laplac... | [
"def",
"compute_Rk",
"(",
"L",
",",
"A",
",",
"n_samples",
")",
":",
"# TODO: need to inspect more into compute Rk.",
"laplacian_matrix",
"=",
"L",
".",
"copy",
"(",
")",
"laplacian_matrix",
".",
"setdiag",
"(",
"0",
")",
"laplacian_matrix",
".",
"eliminate_zeros"... | Compute sparse L matrix and neighbors.
Returns
-------
Rk_tensor : array-like. Length = n
each component corresponds to the sparse matrix of Lk, which is
generated by extracting the kth row of laplacian and removing zeros.
nbk : array-like. Length = n
each component corresponds to... | [
"Compute",
"sparse",
"L",
"matrix",
"and",
"neighbors",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/precomputed.py#L116-L161 | train |
mmp2/megaman | doc/sphinxext/numpy_ext/automodapi.py | _mod_info | def _mod_info(modname, toskip=[], onlylocals=True):
"""
Determines if a module is a module or a package and whether or not
it has classes or functions.
"""
hascls = hasfunc = False
for localnm, fqnm, obj in zip(*find_mod_objs(modname, onlylocals=onlylocals)):
if localnm not in toskip:
... | python | def _mod_info(modname, toskip=[], onlylocals=True):
"""
Determines if a module is a module or a package and whether or not
it has classes or functions.
"""
hascls = hasfunc = False
for localnm, fqnm, obj in zip(*find_mod_objs(modname, onlylocals=onlylocals)):
if localnm not in toskip:
... | [
"def",
"_mod_info",
"(",
"modname",
",",
"toskip",
"=",
"[",
"]",
",",
"onlylocals",
"=",
"True",
")",
":",
"hascls",
"=",
"hasfunc",
"=",
"False",
"for",
"localnm",
",",
"fqnm",
",",
"obj",
"in",
"zip",
"(",
"*",
"find_mod_objs",
"(",
"modname",
","... | Determines if a module is a module or a package and whether or not
it has classes or functions. | [
"Determines",
"if",
"a",
"module",
"is",
"a",
"module",
"or",
"a",
"package",
"and",
"whether",
"or",
"not",
"it",
"has",
"classes",
"or",
"functions",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/doc/sphinxext/numpy_ext/automodapi.py#L328-L350 | train |
mmp2/megaman | megaman/geometry/affinity.py | compute_affinity_matrix | def compute_affinity_matrix(adjacency_matrix, method='auto', **kwargs):
"""Compute the affinity matrix with the given method"""
if method == 'auto':
method = 'gaussian'
return Affinity.init(method, **kwargs).affinity_matrix(adjacency_matrix) | python | def compute_affinity_matrix(adjacency_matrix, method='auto', **kwargs):
"""Compute the affinity matrix with the given method"""
if method == 'auto':
method = 'gaussian'
return Affinity.init(method, **kwargs).affinity_matrix(adjacency_matrix) | [
"def",
"compute_affinity_matrix",
"(",
"adjacency_matrix",
",",
"method",
"=",
"'auto'",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"method",
"==",
"'auto'",
":",
"method",
"=",
"'gaussian'",
"return",
"Affinity",
".",
"init",
"(",
"method",
",",
"*",
"*",
... | Compute the affinity matrix with the given method | [
"Compute",
"the",
"affinity",
"matrix",
"with",
"the",
"given",
"method"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/geometry/affinity.py#L11-L15 | train |
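`compute_affinity_matrix` above defaults `method='auto'` to `'gaussian'`. A dense sketch of a Gaussian (heat-kernel) affinity computed from a pairwise distance matrix — the `radius` parameter name is an assumption, not necessarily megaman's keyword:

```python
import numpy as np

def gaussian_affinity(dist, radius=1.0):
    """Gaussian affinity W_ij = exp(-d_ij^2 / radius^2) from distances."""
    return np.exp(-(dist ** 2) / radius ** 2)

# Two points at distance 1: affinity exp(-1) off-diagonal, 1 on the diagonal.
D = np.array([[0.0, 1.0],
              [1.0, 0.0]])
A = gaussian_affinity(D, radius=1.0)
```

The resulting matrix is symmetric with unit diagonal whenever the input distance matrix is symmetric with zero diagonal.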
mmp2/megaman | megaman/embedding/locally_linear.py | barycenter_graph | def barycenter_graph(distance_matrix, X, reg=1e-3):
"""
Computes the barycenter weighted graph for points in X
Parameters
----------
distance_matrix: sparse Ndarray, (N_obs, N_obs) pairwise distance matrix.
X : Ndarray (N_obs, N_dim) observed data matrix.
reg : float, optional
Amoun... | python | def barycenter_graph(distance_matrix, X, reg=1e-3):
"""
Computes the barycenter weighted graph for points in X
Parameters
----------
distance_matrix: sparse Ndarray, (N_obs, N_obs) pairwise distance matrix.
X : Ndarray (N_obs, N_dim) observed data matrix.
reg : float, optional
Amoun... | [
"def",
"barycenter_graph",
"(",
"distance_matrix",
",",
"X",
",",
"reg",
"=",
"1e-3",
")",
":",
"(",
"N",
",",
"d_in",
")",
"=",
"X",
".",
"shape",
"(",
"rows",
",",
"cols",
")",
"=",
"distance_matrix",
".",
"nonzero",
"(",
")",
"W",
"=",
"sparse",... | Computes the barycenter weighted graph for points in X
Parameters
----------
distance_matrix: sparse Ndarray, (N_obs, N_obs) pairwise distance matrix.
X : Ndarray (N_obs, N_dim) observed data matrix.
reg : float, optional
Amount of regularization when solving the least-squares
probl... | [
"Computes",
"the",
"barycenter",
"weighted",
"graph",
"for",
"points",
"in",
"X"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/embedding/locally_linear.py#L22-L57 | train |
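`barycenter_graph` above solves, for each point, a regularized least-squares problem for weights that reconstruct the point from its neighbors and sum to one. A sketch for a single point under those assumptions (the regularization form — `reg` times the trace of the local Gram matrix — follows the docstring's description):

```python
import numpy as np

def barycenter_weights_one(x, neighbors, reg=1e-3):
    """Weights reconstructing x from its neighbors, constrained to sum to 1."""
    G = neighbors - x                                  # center neighborhood on x
    C = G @ G.T                                        # local Gram matrix
    trace = np.trace(C)
    if trace > 0:
        C = C + np.eye(len(neighbors)) * reg * trace   # regularize singular C
    w = np.linalg.solve(C, np.ones(len(neighbors)))
    return w / w.sum()                                 # enforce sum-to-one

x = np.array([0.5, 0.5])
nbrs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w = barycenter_weights_one(x, nbrs)
```

For this symmetric neighborhood the weights come out equal and the barycenter reconstruction `w @ nbrs` recovers `x` exactly.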
mmp2/megaman | megaman/embedding/locally_linear.py | locally_linear_embedding | def locally_linear_embedding(geom, n_components, reg=1e-3,
eigen_solver='auto', random_state=None,
solver_kwds=None):
"""
Perform a Locally Linear Embedding analysis on the data.
Parameters
----------
geom : a Geometry object from megaman.geo... | python | def locally_linear_embedding(geom, n_components, reg=1e-3,
eigen_solver='auto', random_state=None,
solver_kwds=None):
"""
Perform a Locally Linear Embedding analysis on the data.
Parameters
----------
geom : a Geometry object from megaman.geo... | [
"def",
"locally_linear_embedding",
"(",
"geom",
",",
"n_components",
",",
"reg",
"=",
"1e-3",
",",
"eigen_solver",
"=",
"'auto'",
",",
"random_state",
"=",
"None",
",",
"solver_kwds",
"=",
"None",
")",
":",
"if",
"geom",
".",
"X",
"is",
"None",
":",
"rai... | Perform a Locally Linear Embedding analysis on the data.
Parameters
----------
geom : a Geometry object from megaman.geometry.geometry
n_components : integer
number of coordinates for the manifold.
reg : float
regularization constant, multiplies the trace of the local covariance
... | [
"Perform",
"a",
"Locally",
"Linear",
"Embedding",
"analysis",
"on",
"the",
"data",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/embedding/locally_linear.py#L60-L128 | train |
mmp2/megaman | megaman/utils/validation.py | _num_samples | def _num_samples(x):
"""Return number of samples in array-like x."""
if hasattr(x, 'fit'):
# Don't get num_samples from an ensembles length!
raise TypeError('Expected sequence or array-like, got '
'estimator %s' % x)
if not hasattr(x, '__len__') and not hasattr(x, 'sh... | python | def _num_samples(x):
"""Return number of samples in array-like x."""
if hasattr(x, 'fit'):
# Don't get num_samples from an ensembles length!
raise TypeError('Expected sequence or array-like, got '
'estimator %s' % x)
if not hasattr(x, '__len__') and not hasattr(x, 'sh... | [
"def",
"_num_samples",
"(",
"x",
")",
":",
"if",
"hasattr",
"(",
"x",
",",
"'fit'",
")",
":",
"# Don't get num_samples from an ensembles length!",
"raise",
"TypeError",
"(",
"'Expected sequence or array-like, got '",
"'estimator %s'",
"%",
"x",
")",
"if",
"not",
"ha... | Return number of samples in array-like x. | [
"Return",
"number",
"of",
"samples",
"in",
"array",
"-",
"like",
"x",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/utils/validation.py#L68-L86 | train |
mmp2/megaman | megaman/utils/spectral_clustering.py | spectral_clustering | def spectral_clustering(geom, K, eigen_solver = 'dense', random_state = None, solver_kwds = None,
renormalize = True, stabalize = True, additional_vectors = 0):
"""
Spectral clustering for finding K clusters by using the eigenvectors of a
matrix which is derived from a set of similari... | python | def spectral_clustering(geom, K, eigen_solver = 'dense', random_state = None, solver_kwds = None,
renormalize = True, stabalize = True, additional_vectors = 0):
"""
Spectral clustering for finding K clusters by using the eigenvectors of a
matrix which is derived from a set of similari... | [
"def",
"spectral_clustering",
"(",
"geom",
",",
"K",
",",
"eigen_solver",
"=",
"'dense'",
",",
"random_state",
"=",
"None",
",",
"solver_kwds",
"=",
"None",
",",
"renormalize",
"=",
"True",
",",
"stabalize",
"=",
"True",
",",
"additional_vectors",
"=",
"0",
Spectral clustering for finding K clusters by using the eigenvectors of a
matrix which is derived from a set of similarities S.
Parameters
-----------
S: array-like,shape(n_sample,n_sample)
similarity matrix
K: integer
number of K clusters
eigen_solver : {'auto', 'dense', 'arpack... | [
"Spectral",
"clustering",
"for",
"find",
"K",
"clusters",
"by",
"using",
"the",
"eigenvectors",
"of",
"a",
"matrix",
"which",
"is",
"derived",
"from",
"a",
"set",
"of",
"similarities",
"S",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/utils/spectral_clustering.py#L94-L193 | train |
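The reason eigenvectors of a similarity matrix can reveal clusters, as `spectral_clustering` above exploits, is that leading eigenvectors of the normalized affinity are constant on connected components. A sketch (embedding step only, no k-means) on a two-clique graph; the normalization `D^{-1/2} S D^{-1/2}` is one standard choice and not necessarily megaman's exact pipeline:

```python
import numpy as np

# Block-diagonal similarity: two disconnected 3-node cliques.
S = np.zeros((6, 6))
S[:3, :3] = 1.0
S[3:, 3:] = 1.0

d = S.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
M = D_inv_sqrt @ S @ D_inv_sqrt          # symmetrically normalized affinity

vals, vecs = np.linalg.eigh(M)           # eigenvalues in ascending order
embedding = vecs[:, -2:]                 # top-2 eigenvectors as coordinates
```

Rows of `embedding` coincide within a clique and differ between cliques, so any trivial clustering of the embedded points recovers the two components.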
mmp2/megaman | megaman/plotter/covar_plotter3.py | pathpatch_2d_to_3d | def pathpatch_2d_to_3d(pathpatch, z = 0, normal = 'z'):
"""
Transforms a 2D Patch to a 3D patch using the given normal vector.
The patch is projected into the XY plane, rotated about the origin
and finally translated by z.
"""
if type(normal) is str: #Translate strings to normal vectors
... | python | def pathpatch_2d_to_3d(pathpatch, z = 0, normal = 'z'):
"""
Transforms a 2D Patch to a 3D patch using the given normal vector.
The patch is projected into the XY plane, rotated about the origin
and finally translated by z.
"""
if type(normal) is str: #Translate strings to normal vectors
... | [
"def",
"pathpatch_2d_to_3d",
"(",
"pathpatch",
",",
"z",
"=",
"0",
",",
"normal",
"=",
"'z'",
")",
":",
"if",
"type",
"(",
"normal",
")",
"is",
"str",
":",
"#Translate strings to normal vectors",
"index",
"=",
"\"xyz\"",
".",
"index",
"(",
"normal",
")",
... | Transforms a 2D Patch to a 3D patch using the given normal vector.
The patch is projected into the XY plane, rotated about the origin
and finally translated by z. | [
"Transforms",
"a",
"2D",
"Patch",
"to",
"a",
"3D",
"patch",
"using",
"the",
"given",
"normal",
"vector",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/plotter/covar_plotter3.py#L44-L73 | train |
mmp2/megaman | megaman/plotter/covar_plotter3.py | calc_2d_ellipse_properties | def calc_2d_ellipse_properties(cov,nstd=2):
"""Calculate the properties for 2d ellipse given the covariance matrix."""
def eigsorted(cov):
vals, vecs = np.linalg.eigh(cov)
order = vals.argsort()[::-1]
return vals[order], vecs[:,order]
vals, vecs = eigsorted(cov)
width, height = ... | python | def calc_2d_ellipse_properties(cov,nstd=2):
"""Calculate the properties for 2d ellipse given the covariance matrix."""
def eigsorted(cov):
vals, vecs = np.linalg.eigh(cov)
order = vals.argsort()[::-1]
return vals[order], vecs[:,order]
vals, vecs = eigsorted(cov)
width, height = ... | [
"def",
"calc_2d_ellipse_properties",
"(",
"cov",
",",
"nstd",
"=",
"2",
")",
":",
"def",
"eigsorted",
"(",
"cov",
")",
":",
"vals",
",",
"vecs",
"=",
"np",
".",
"linalg",
".",
"eigh",
"(",
"cov",
")",
"order",
"=",
"vals",
".",
"argsort",
"(",
")",... | Calculate the properties for 2d ellipse given the covariance matrix. | [
"Calculate",
"the",
"properties",
"for",
"2d",
"ellipse",
"given",
"the",
"covariance",
"matrix",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/plotter/covar_plotter3.py#L101-L116 | train |
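`calc_2d_ellipse_properties` above extracts an `nstd`-sigma ellipse from a covariance matrix via its sorted eigendecomposition, mirroring the visible `eigsorted` helper. A self-contained sketch of that computation:

```python
import numpy as np

def ellipse_from_cov(cov, nstd=2):
    """Width, height, and angle (degrees) of the nstd-sigma ellipse of cov."""
    vals, vecs = np.linalg.eigh(cov)
    order = vals.argsort()[::-1]               # largest eigenvalue first
    vals, vecs = vals[order], vecs[:, order]
    width, height = 2 * nstd * np.sqrt(vals)   # full axis lengths
    angle = np.degrees(np.arctan2(*vecs[:, 0][::-1]))  # principal-axis angle
    return width, height, angle

# Diagonal covariance with variances 4 and 1: axis-aligned ellipse.
w, h, ang = ellipse_from_cov(np.diag([4.0, 1.0]), nstd=2)
```

With `nstd=2` and variances 4 and 1 the full width is `2*2*sqrt(4) = 8` and the full height is `2*2*sqrt(1) = 4`.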
mmp2/megaman | megaman/plotter/covar_plotter3.py | rotation_matrix | def rotation_matrix(d):
"""
Calculates a rotation matrix given a vector d. The direction of d
corresponds to the rotation axis. The length of d corresponds to
the sin of the angle of rotation.
Variant of: http://mail.scipy.org/pipermail/numpy-discussion/2009-March/040806.html
"""
sin_angle ... | python | def rotation_matrix(d):
"""
Calculates a rotation matrix given a vector d. The direction of d
corresponds to the rotation axis. The length of d corresponds to
the sin of the angle of rotation.
Variant of: http://mail.scipy.org/pipermail/numpy-discussion/2009-March/040806.html
"""
sin_angle ... | [
"def",
"rotation_matrix",
"(",
"d",
")",
":",
"sin_angle",
"=",
"np",
".",
"linalg",
".",
"norm",
"(",
"d",
")",
"if",
"sin_angle",
"==",
"0",
":",
"return",
"np",
".",
"identity",
"(",
"3",
")",
"d",
"/=",
"sin_angle",
"eye",
"=",
"np",
".",
"ey... | Calculates a rotation matrix given a vector d. The direction of d
corresponds to the rotation axis. The length of d corresponds to
the sin of the angle of rotation.
Variant of: http://mail.scipy.org/pipermail/numpy-discussion/2009-March/040806.html | [
"Calculates",
"a",
"rotation",
"matrix",
"given",
"a",
"vector",
"d",
".",
"The",
"direction",
"of",
"d",
"corresponds",
"to",
"the",
"rotation",
"axis",
".",
"The",
"length",
"of",
"d",
"corresponds",
"to",
"the",
"sin",
"of",
"the",
"angle",
"of",
"rot... | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/plotter/covar_plotter3.py#L118-L140 | train |
mmp2/megaman | megaman/plotter/covar_plotter3.py | create_ellipse | def create_ellipse(width,height,angle):
"""Create parametric ellipse from 200 points."""
angle = angle / 180.0 * np.pi
thetas = np.linspace(0,2*np.pi,200)
a = width / 2.0
b = height / 2.0
x = a*np.cos(thetas)*np.cos(angle) - b*np.sin(thetas)*np.sin(angle)
y = a*np.cos(thetas)*np.sin(angle) ... | python | def create_ellipse(width,height,angle):
"""Create parametric ellipse from 200 points."""
angle = angle / 180.0 * np.pi
thetas = np.linspace(0,2*np.pi,200)
a = width / 2.0
b = height / 2.0
x = a*np.cos(thetas)*np.cos(angle) - b*np.sin(thetas)*np.sin(angle)
y = a*np.cos(thetas)*np.sin(angle) ... | [
"def",
"create_ellipse",
"(",
"width",
",",
"height",
",",
"angle",
")",
":",
"angle",
"=",
"angle",
"/",
"180.0",
"*",
"np",
".",
"pi",
"thetas",
"=",
"np",
".",
"linspace",
"(",
"0",
",",
"2",
"*",
"np",
".",
"pi",
",",
"200",
")",
"a",
"=",
... | Create parametric ellipse from 200 points. | [
"Create",
"parametric",
"ellipse",
"from",
"200",
"points",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/plotter/covar_plotter3.py#L142-L152 | train |
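`create_ellipse` above samples 200 points on a parametric, rotated ellipse. A self-contained version of the same parametrization, returning the points as an `(200, 2)` array rather than separate `x`/`y`:

```python
import numpy as np

def create_ellipse(width, height, angle):
    """Sample 200 points on an ellipse rotated by `angle` degrees."""
    angle = np.radians(angle)
    thetas = np.linspace(0, 2 * np.pi, 200)
    a, b = width / 2.0, height / 2.0
    x = a * np.cos(thetas) * np.cos(angle) - b * np.sin(thetas) * np.sin(angle)
    y = a * np.cos(thetas) * np.sin(angle) + b * np.sin(thetas) * np.cos(angle)
    return np.column_stack([x, y])

pts = create_ellipse(4.0, 2.0, 0.0)      # axis-aligned: a = 2, b = 1
```

For `angle=0` every sampled point satisfies the implicit equation `(x/a)^2 + (y/b)^2 = 1`.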
mmp2/megaman | megaman/plotter/covar_plotter3.py | transform_to_3d | def transform_to_3d(points,normal,z=0):
"""Project points into 3d from 2d points."""
d = np.cross(normal, (0, 0, 1))
M = rotation_matrix(d)
transformed_points = M.dot(points.T).T + z
return transformed_points | python | def transform_to_3d(points,normal,z=0):
"""Project points into 3d from 2d points."""
d = np.cross(normal, (0, 0, 1))
M = rotation_matrix(d)
transformed_points = M.dot(points.T).T + z
return transformed_points | [
"def",
"transform_to_3d",
"(",
"points",
",",
"normal",
",",
"z",
"=",
"0",
")",
":",
"d",
"=",
"np",
".",
"cross",
"(",
"normal",
",",
"(",
"0",
",",
"0",
",",
"1",
")",
")",
"M",
"=",
"rotation_matrix",
"(",
"d",
")",
"transformed_points",
"=",... | Project points into 3d from 2d points. | [
"Project",
"points",
"into",
"3d",
"from",
"2d",
"points",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/plotter/covar_plotter3.py#L154-L159 | train |
mmp2/megaman | megaman/plotter/covar_plotter3.py | create_ellipse_mesh | def create_ellipse_mesh(points,**kwargs):
"""Visualize the ellipse by using the mesh of the points."""
import plotly.graph_objs as go
x,y,z = points.T
return (go.Mesh3d(x=x,y=y,z=z,**kwargs),
go.Scatter3d(x=x, y=y, z=z,
marker=dict(size=0.01),
... | python | def create_ellipse_mesh(points,**kwargs):
"""Visualize the ellipse by using the mesh of the points."""
import plotly.graph_objs as go
x,y,z = points.T
return (go.Mesh3d(x=x,y=y,z=z,**kwargs),
go.Scatter3d(x=x, y=y, z=z,
marker=dict(size=0.01),
... | [
"def",
"create_ellipse_mesh",
"(",
"points",
",",
"*",
"*",
"kwargs",
")",
":",
"import",
"plotly",
".",
"graph_objs",
"as",
"go",
"x",
",",
"y",
",",
"z",
"=",
"points",
".",
"T",
"return",
"(",
"go",
".",
"Mesh3d",
"(",
"x",
"=",
"x",
",",
"y",... | Visualize the ellipse by using the mesh of the points. | [
"Visualize",
"the",
"ellipse",
"by",
"using",
"the",
"mesh",
"of",
"the",
"points",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/plotter/covar_plotter3.py#L166-L177 | train |
mmp2/megaman | megaman/embedding/ltsa.py | ltsa | def ltsa(geom, n_components, eigen_solver='auto',
random_state=None, solver_kwds=None):
"""
Perform a Local Tangent Space Alignment analysis on the data.
Parameters
----------
geom : a Geometry object from megaman.geometry.geometry
n_components : integer
number of coordinates f... | python | def ltsa(geom, n_components, eigen_solver='auto',
random_state=None, solver_kwds=None):
"""
Perform a Local Tangent Space Alignment analysis on the data.
Parameters
----------
geom : a Geometry object from megaman.geometry.geometry
n_components : integer
number of coordinates f... | [
"def",
"ltsa",
"(",
"geom",
",",
"n_components",
",",
"eigen_solver",
"=",
"'auto'",
",",
"random_state",
"=",
"None",
",",
"solver_kwds",
"=",
"None",
")",
":",
"if",
"geom",
".",
"X",
"is",
"None",
":",
"raise",
"ValueError",
"(",
"\"Must pass data matri... | Perform a Local Tangent Space Alignment analysis on the data.
Parameters
----------
geom : a Geometry object from megaman.geometry.geometry
n_components : integer
number of coordinates for the manifold.
eigen_solver : {'auto', 'dense', 'arpack', 'lobpcg', or 'amg'}
'auto' :
... | [
"Perform",
"a",
"Local",
"Tangent",
"Space",
"Alignment",
"analysis",
"on",
"the",
"data",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/embedding/ltsa.py#L24-L111 | train |
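The core local step behind LTSA, described in the row above, is estimating each neighborhood's tangent space from the top right-singular vectors of the centered local data. A sketch of that step on points lying exactly on a 2D plane in 3D (the alignment step is omitted):

```python
import numpy as np

rng = np.random.RandomState(0)
# Points on a 2D plane embedded in 3D: the tangent space is the plane itself.
coeffs = rng.randn(20, 2)
basis = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 1.0]])
X = coeffs @ basis                       # (20, 3) points, rank-2 by construction

Xc = X - X.mean(axis=0)                  # center the neighborhood
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
tangent = Vt[:2]                         # top-2 right-singular vectors
```

Because the data is exactly planar, the third singular value is numerically zero and projecting onto `tangent` reproduces the centered points.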
mmp2/megaman | megaman/relaxation/riemannian_relaxation.py | run_riemannian_relaxation | def run_riemannian_relaxation(laplacian, initial_guess,
intrinsic_dim, relaxation_kwds):
"""Helper function for creating a RiemannianRelaxation class."""
n, s = initial_guess.shape
relaxation_kwds = initialize_kwds(relaxation_kwds, n, s, intrinsic_dim)
if relaxation_kwds['s... | python | def run_riemannian_relaxation(laplacian, initial_guess,
intrinsic_dim, relaxation_kwds):
"""Helper function for creating a RiemannianRelaxation class."""
n, s = initial_guess.shape
relaxation_kwds = initialize_kwds(relaxation_kwds, n, s, intrinsic_dim)
if relaxation_kwds['s... | [
"def",
"run_riemannian_relaxation",
"(",
"laplacian",
",",
"initial_guess",
",",
"intrinsic_dim",
",",
"relaxation_kwds",
")",
":",
"n",
",",
"s",
"=",
"initial_guess",
".",
"shape",
"relaxation_kwds",
"=",
"initialize_kwds",
"(",
"relaxation_kwds",
",",
"n",
",",... | Helper function for creating a RiemannianRelaxation class. | [
"Helper",
"function",
"for",
"creating",
"a",
"RiemannianRelaxation",
"class",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/riemannian_relaxation.py#L19-L32 | train |
mmp2/megaman | megaman/relaxation/riemannian_relaxation.py | RiemannianRelaxation.relax_isometry | def relax_isometry(self):
"""Main function for doing riemannian relaxation."""
for ii in range(self.relaxation_kwds['niter']):
self.H = self.compute_dual_rmetric()
self.loss = self.rieman_loss()
self.trace_var.update(ii,self.H,self.Y,self.eta,self.loss)
s... | python | def relax_isometry(self):
"""Main function for doing riemannian relaxation."""
for ii in range(self.relaxation_kwds['niter']):
self.H = self.compute_dual_rmetric()
self.loss = self.rieman_loss()
self.trace_var.update(ii,self.H,self.Y,self.eta,self.loss)
s... | [
"def",
"relax_isometry",
"(",
"self",
")",
":",
"for",
"ii",
"in",
"range",
"(",
"self",
".",
"relaxation_kwds",
"[",
"'niter'",
"]",
")",
":",
"self",
".",
"H",
"=",
"self",
".",
"compute_dual_rmetric",
"(",
")",
"self",
".",
"loss",
"=",
"self",
".... | Main function for doing riemannian relaxation. | [
"Main",
"function",
"for",
"doing",
"riemannian",
"relaxation",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/riemannian_relaxation.py#L77-L96 | train |
mmp2/megaman | megaman/relaxation/riemannian_relaxation.py | RiemannianRelaxation.calc_loss | def calc_loss(self, embedding):
"""Helper function to calculate rieman loss given new embedding"""
Hnew = self.compute_dual_rmetric(Ynew=embedding)
return self.rieman_loss(Hnew=Hnew) | python | def calc_loss(self, embedding):
"""Helper function to calculate rieman loss given new embedding"""
Hnew = self.compute_dual_rmetric(Ynew=embedding)
return self.rieman_loss(Hnew=Hnew) | [
"def",
"calc_loss",
"(",
"self",
",",
"embedding",
")",
":",
"Hnew",
"=",
"self",
".",
"compute_dual_rmetric",
"(",
"Ynew",
"=",
"embedding",
")",
"return",
"self",
".",
"rieman_loss",
"(",
"Hnew",
"=",
"Hnew",
")"
] | Helper function to calculate rieman loss given new embedding | [
"Helper",
"function",
"to",
"calculate",
"rieman",
"loss",
"given",
"new",
"embedding"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/riemannian_relaxation.py#L98-L101 | train |
mmp2/megaman | megaman/relaxation/riemannian_relaxation.py | RiemannianRelaxation.compute_dual_rmetric | def compute_dual_rmetric(self,Ynew=None):
"""Helper function to calculate the """
usedY = self.Y if Ynew is None else Ynew
rieman_metric = RiemannMetric(usedY, self.laplacian_matrix)
return rieman_metric.get_dual_rmetric() | python | def compute_dual_rmetric(self,Ynew=None):
"""Helper function to calculate the """
usedY = self.Y if Ynew is None else Ynew
rieman_metric = RiemannMetric(usedY, self.laplacian_matrix)
return rieman_metric.get_dual_rmetric() | [
"def",
"compute_dual_rmetric",
"(",
"self",
",",
"Ynew",
"=",
"None",
")",
":",
"usedY",
"=",
"self",
".",
"Y",
"if",
"Ynew",
"is",
"None",
"else",
"Ynew",
"rieman_metric",
"=",
"RiemannMetric",
"(",
"usedY",
",",
"self",
".",
"laplacian_matrix",
")",
"r... | Helper function to calculate the | [
"Helper",
"function",
"to",
"calculate",
"the"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/riemannian_relaxation.py#L103-L107 | train |
mmp2/megaman | doc/sphinxext/numpy_ext/automodsumm.py | automodsumm_to_autosummary_lines | def automodsumm_to_autosummary_lines(fn, app):
"""
Generates lines from a file with an "automodsumm" entry suitable for
feeding into "autosummary".
Searches the provided file for `automodsumm` directives and returns
a list of lines specifying the `autosummary` commands for the modules
requested... | python | def automodsumm_to_autosummary_lines(fn, app):
"""
Generates lines from a file with an "automodsumm" entry suitable for
feeding into "autosummary".
Searches the provided file for `automodsumm` directives and returns
a list of lines specifying the `autosummary` commands for the modules
requested... | [
"def",
"automodsumm_to_autosummary_lines",
"(",
"fn",
",",
"app",
")",
":",
"fullfn",
"=",
"os",
".",
"path",
".",
"join",
"(",
"app",
".",
"builder",
".",
"env",
".",
"srcdir",
",",
"fn",
")",
"with",
"open",
"(",
"fullfn",
")",
"as",
"fr",
":",
"... | Generates lines from a file with an "automodsumm" entry suitable for
feeding into "autosummary".
Searches the provided file for `automodsumm` directives and returns
a list of lines specifying the `autosummary` commands for the modules
requested. This does *not* return the whole file contents - just an
... | [
"Generates",
"lines",
"from",
"a",
"file",
"with",
"an",
"automodsumm",
"entry",
"suitable",
"for",
"feeding",
"into",
"autosummary",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/doc/sphinxext/numpy_ext/automodsumm.py#L265-L369 | train |
mmp2/megaman | megaman/geometry/adjacency.py | compute_adjacency_matrix | def compute_adjacency_matrix(X, method='auto', **kwargs):
"""Compute an adjacency matrix with the given method"""
if method == 'auto':
if X.shape[0] > 10000:
method = 'cyflann'
else:
method = 'kd_tree'
return Adjacency.init(method, **kwargs).adjacency_graph(X.astype('... | python | def compute_adjacency_matrix(X, method='auto', **kwargs):
"""Compute an adjacency matrix with the given method"""
if method == 'auto':
if X.shape[0] > 10000:
method = 'cyflann'
else:
method = 'kd_tree'
return Adjacency.init(method, **kwargs).adjacency_graph(X.astype('... | [
"def",
"compute_adjacency_matrix",
"(",
"X",
",",
"method",
"=",
"'auto'",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"method",
"==",
"'auto'",
":",
"if",
"X",
".",
"shape",
"[",
"0",
"]",
">",
"10000",
":",
"method",
"=",
"'cyflann'",
"else",
":",
"... | Compute an adjacency matrix with the given method | [
"Compute",
"an",
"adjacency",
"matrix",
"with",
"the",
"given",
"method"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/geometry/adjacency.py#L17-L24 | train |
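`compute_adjacency_matrix` above dispatches to a neighbor-search backend (`kd_tree` for small inputs, `cyflann` for large ones). A sketch of the kd-tree path using SciPy's `cKDTree`, which returns a sparse matrix of pairwise distances within a radius — the `radius` keyword is one of several parameters such methods accept:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix

def radius_adjacency(X, radius):
    """Sparse symmetric distance matrix for all pairs within `radius`."""
    tree = cKDTree(X)
    dok = tree.sparse_distance_matrix(tree, max_distance=radius)
    return csr_matrix(dok)

X = np.array([[0.0], [1.0], [2.0], [10.0]])
A = radius_adjacency(X, radius=1.5)
```

Pairs farther apart than the radius are simply absent from the sparse structure, so they read back as zero.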
mmp2/megaman | megaman/relaxation/utils.py | split_kwargs | def split_kwargs(relaxation_kwds):
"""Split relaxation keywords to keywords for optimizer and others"""
optimizer_keys_list = [
'step_method',
'linesearch',
'eta_max',
'eta',
'm',
'linesearch_first'
]
optimizer_kwargs = { k:relaxation_kwds.pop(k) for k in ... | python | def split_kwargs(relaxation_kwds):
"""Split relaxation keywords to keywords for optimizer and others"""
optimizer_keys_list = [
'step_method',
'linesearch',
'eta_max',
'eta',
'm',
'linesearch_first'
]
optimizer_kwargs = { k:relaxation_kwds.pop(k) for k in ... | [
"def",
"split_kwargs",
"(",
"relaxation_kwds",
")",
":",
"optimizer_keys_list",
"=",
"[",
"'step_method'",
",",
"'linesearch'",
",",
"'eta_max'",
",",
"'eta'",
",",
"'m'",
",",
"'linesearch_first'",
"]",
"optimizer_kwargs",
"=",
"{",
"k",
":",
"relaxation_kwds",
... | Split relaxation keywords into keywords for the optimizer and others | [
"Split",
"relaxation",
"keywords",
"to",
"keywords",
"for",
"optimizer",
"and",
"others"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/utils.py#L10-L23 | train |
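`split_kwargs` above pops a fixed set of optimizer-specific keys out of the mixed relaxation keyword dict. The same dict-comprehension-with-`pop` pattern, made self-contained (the guard `if k in relaxation_kwds` is an addition so missing keys don't raise):

```python
def split_kwargs(relaxation_kwds,
                 optimizer_keys=('step_method', 'linesearch', 'eta_max',
                                 'eta', 'm', 'linesearch_first')):
    """Pop optimizer-specific keys out of a mixed keyword dict."""
    optimizer_kwargs = {k: relaxation_kwds.pop(k)
                        for k in optimizer_keys if k in relaxation_kwds}
    return optimizer_kwargs, relaxation_kwds

opt, rest = split_kwargs({'eta': 0.1, 'niter': 100, 'linesearch': True})
```

After the split, `opt` holds only optimizer keys and `rest` holds everything else; the input dict is mutated in place by `pop`.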
mmp2/megaman | megaman/relaxation/utils.py | initialize_kwds | def initialize_kwds(relaxation_kwds, n_samples, n_components, intrinsic_dim):
"""
Initialize relaxation keywords.
Parameters
----------
relaxation_kwds : dict
weights : numpy array, the weights
step_method : string { 'fixed', 'momentum' }
which optimizers to use
... | python | def initialize_kwds(relaxation_kwds, n_samples, n_components, intrinsic_dim):
"""
Initialize relaxation keywords.
Parameters
----------
relaxation_kwds : dict
weights : numpy array, the weights
step_method : string { 'fixed', 'momentum' }
which optimizers to use
... | [
"def",
"initialize_kwds",
"(",
"relaxation_kwds",
",",
"n_samples",
",",
"n_components",
",",
"intrinsic_dim",
")",
":",
"new_relaxation_kwds",
"=",
"{",
"'weights'",
":",
"np",
".",
"array",
"(",
"[",
"]",
",",
"dtype",
"=",
"np",
".",
"float64",
")",
","... | Initialize relaxation keywords.
Parameters
----------
relaxation_kwds : dict
weights : numpy array, the weights
step_method : string { 'fixed', 'momentum' }
which optimizers to use
linesearch : bool
whether to do linesearch in search for eta in optimization
... | [
"Initialize",
"relaxation",
"keywords",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/utils.py#L26-L127 | train |
mmp2/megaman | megaman/embedding/spectral_embedding.py | _graph_connected_component | def _graph_connected_component(graph, node_id):
"""
Find the largest graph connected component that contains one
given node
Parameters
----------
graph : array-like, shape: (n_samples, n_samples)
adjacency matrix of the graph, non-zero weight means an edge
between the nodes
... | python | def _graph_connected_component(graph, node_id):
"""
Find the largest graph connected component that contains one
given node
Parameters
----------
graph : array-like, shape: (n_samples, n_samples)
adjacency matrix of the graph, non-zero weight means an edge
between the nodes
... | [
"def",
"_graph_connected_component",
"(",
"graph",
",",
"node_id",
")",
":",
"connected_components",
"=",
"np",
".",
"zeros",
"(",
"shape",
"=",
"(",
"graph",
".",
"shape",
"[",
"0",
"]",
")",
",",
"dtype",
"=",
"np",
".",
"bool",
")",
"connected_compone... | Find the largest graph connected components the contains one
given node
Parameters
----------
graph : array-like, shape: (n_samples, n_samples)
adjacency matrix of the graph, non-zero weight means an edge
between the nodes
node_id : int
The index of the query node of the gr... | [
"Find",
"the",
"largest",
"graph",
"connected",
"components",
"the",
"contains",
"one",
"given",
"node"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/embedding/spectral_embedding.py#L28-L58 | train |
mmp2/megaman | megaman/embedding/spectral_embedding.py | SpectralEmbedding.predict | def predict(self, X_test, y=None):
"""
Predict embedding on new data X_test given the existing embedding on training data
Uses the Nystrom Extension to estimate the eigenvectors.
Currently only works with input_type data (i.e. not affinity or distance)
"""
if not hasatt... | python | def predict(self, X_test, y=None):
"""
Predict embedding on new data X_test given the existing embedding on training data
Uses the Nystrom Extension to estimate the eigenvectors.
Currently only works with input_type data (i.e. not affinity or distance)
"""
if not hasatt... | [
"def",
"predict",
"(",
"self",
",",
"X_test",
",",
"y",
"=",
"None",
")",
":",
"if",
"not",
"hasattr",
"(",
"self",
",",
"'geom_'",
")",
":",
"raise",
"RuntimeError",
"(",
"'the .fit() function must be called before the .predict() function'",
")",
"if",
"self",
... | Predict embedding on new data X_test given the existing embedding on training data
Uses the Nystrom Extension to estimate the eigenvectors.
Currently only works with input_type data (i.e. not affinity or distance) | [
"Predict",
"embedding",
"on",
"new",
"data",
"X_test",
"given",
"the",
"existing",
"embedding",
"on",
"training",
"data"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/embedding/spectral_embedding.py#L408-L465 | train |
mmp2/megaman | megaman/geometry/laplacian.py | compute_laplacian_matrix | def compute_laplacian_matrix(affinity_matrix, method='auto', **kwargs):
"""Compute the laplacian matrix with the given method"""
if method == 'auto':
method = 'geometric'
return Laplacian.init(method, **kwargs).laplacian_matrix(affinity_matrix) | python | def compute_laplacian_matrix(affinity_matrix, method='auto', **kwargs):
"""Compute the laplacian matrix with the given method"""
if method == 'auto':
method = 'geometric'
return Laplacian.init(method, **kwargs).laplacian_matrix(affinity_matrix) | [
"def",
"compute_laplacian_matrix",
"(",
"affinity_matrix",
",",
"method",
"=",
"'auto'",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"method",
"==",
"'auto'",
":",
"method",
"=",
"'geometric'",
"return",
"Laplacian",
".",
"init",
"(",
"method",
",",
"*",
"*",... | Compute the laplacian matrix with the given method | [
"Compute",
"the",
"laplacian",
"matrix",
"with",
"the",
"given",
"method"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/geometry/laplacian.py#L10-L14 | train |
mmp2/megaman | megaman/embedding/base.py | BaseEmbedding.fit_geometry | def fit_geometry(self, X=None, input_type='data'):
"""Inputs self.geom, and produces the fitted geometry self.geom_"""
if self.geom is None:
self.geom_ = Geometry()
elif isinstance(self.geom, Geometry):
self.geom_ = self.geom
else:
try:
... | python | def fit_geometry(self, X=None, input_type='data'):
"""Inputs self.geom, and produces the fitted geometry self.geom_"""
if self.geom is None:
self.geom_ = Geometry()
elif isinstance(self.geom, Geometry):
self.geom_ = self.geom
else:
try:
... | [
"def",
"fit_geometry",
"(",
"self",
",",
"X",
"=",
"None",
",",
"input_type",
"=",
"'data'",
")",
":",
"if",
"self",
".",
"geom",
"is",
"None",
":",
"self",
".",
"geom_",
"=",
"Geometry",
"(",
")",
"elif",
"isinstance",
"(",
"self",
".",
"geom",
",... | Inputs self.geom, and produces the fitted geometry self.geom_ | [
"Inputs",
"self",
".",
"geom",
"and",
"produces",
"the",
"fitted",
"geometry",
"self",
".",
"geom_"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/embedding/base.py#L87-L115 | train |
mmp2/megaman | megaman/geometry/geometry.py | Geometry.set_radius | def set_radius(self, radius, override=True, X=None, n_components=2):
"""Set the radius for the adjacency and affinity computation
By default, this will override keyword arguments provided on
initialization.
Parameters
----------
radius : float
radius to set ... | python | def set_radius(self, radius, override=True, X=None, n_components=2):
"""Set the radius for the adjacency and affinity computation
By default, this will override keyword arguments provided on
initialization.
Parameters
----------
radius : float
radius to set ... | [
"def",
"set_radius",
"(",
"self",
",",
"radius",
",",
"override",
"=",
"True",
",",
"X",
"=",
"None",
",",
"n_components",
"=",
"2",
")",
":",
"if",
"radius",
"<",
"0",
":",
"raise",
"ValueError",
"(",
"\"radius must be non-negative\"",
")",
"if",
"overr... | Set the radius for the adjacency and affinity computation
By default, this will override keyword arguments provided on
initialization.
Parameters
----------
radius : float
radius to set for adjacency and affinity.
override : bool (default: True)
... | [
"Set",
"the",
"radius",
"for",
"the",
"adjacency",
"and",
"affinity",
"computation"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/geometry/geometry.py#L114-L140 | train |
mmp2/megaman | megaman/geometry/rmetric.py | RiemannMetric.get_rmetric | def get_rmetric( self, mode_inv = 'svd', return_svd = False ):
"""
Compute the Reimannian Metric
"""
if self.H is None:
self.H, self.G, self.Hvv, self.Hsval = riemann_metric(self.Y, self.L, self.mdimG, invert_h = True, mode_inv = mode_inv)
if self.G is None:
... | python | def get_rmetric( self, mode_inv = 'svd', return_svd = False ):
"""
Compute the Reimannian Metric
"""
if self.H is None:
self.H, self.G, self.Hvv, self.Hsval = riemann_metric(self.Y, self.L, self.mdimG, invert_h = True, mode_inv = mode_inv)
if self.G is None:
... | [
"def",
"get_rmetric",
"(",
"self",
",",
"mode_inv",
"=",
"'svd'",
",",
"return_svd",
"=",
"False",
")",
":",
"if",
"self",
".",
"H",
"is",
"None",
":",
"self",
".",
"H",
",",
"self",
".",
"G",
",",
"self",
".",
"Hvv",
",",
"self",
".",
"Hsval",
... | Compute the Reimannian Metric | [
"Compute",
"the",
"Reimannian",
"Metric"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/geometry/rmetric.py#L270-L281 | train |
mmp2/megaman | megaman/relaxation/trace_variable.py | TracingVariable.report_and_save_keywords | def report_and_save_keywords(self,relaxation_kwds,precomputed_kwds):
"""Save relaxation keywords to .txt and .pyc file"""
report_name = os.path.join(self.backup_dir,'relaxation_keywords.txt')
pretty_relax_kwds = pprint.pformat(relaxation_kwds,indent=4)
with open(report_name,'w') as wf:
... | python | def report_and_save_keywords(self,relaxation_kwds,precomputed_kwds):
"""Save relaxation keywords to .txt and .pyc file"""
report_name = os.path.join(self.backup_dir,'relaxation_keywords.txt')
pretty_relax_kwds = pprint.pformat(relaxation_kwds,indent=4)
with open(report_name,'w') as wf:
... | [
"def",
"report_and_save_keywords",
"(",
"self",
",",
"relaxation_kwds",
",",
"precomputed_kwds",
")",
":",
"report_name",
"=",
"os",
".",
"path",
".",
"join",
"(",
"self",
".",
"backup_dir",
",",
"'relaxation_keywords.txt'",
")",
"pretty_relax_kwds",
"=",
"pprint"... | Save relaxation keywords to .txt and .pyc file | [
"Save",
"relaxation",
"keywords",
"to",
".",
"txt",
"and",
".",
"pyc",
"file"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/trace_variable.py#L36-L55 | train |
mmp2/megaman | megaman/relaxation/trace_variable.py | TracingVariable.update | def update(self,iiter,H,Y,eta,loss):
"""Update the trace_var in new iteration"""
if iiter <= self.niter_trace+1:
self.H[iiter] = H
self.Y[iiter] = Y
elif iiter >self.niter - self.niter_trace + 1:
self.H[self.ltrace+iiter-self.niter-1] = H
self.Y[se... | python | def update(self,iiter,H,Y,eta,loss):
"""Update the trace_var in new iteration"""
if iiter <= self.niter_trace+1:
self.H[iiter] = H
self.Y[iiter] = Y
elif iiter >self.niter - self.niter_trace + 1:
self.H[self.ltrace+iiter-self.niter-1] = H
self.Y[se... | [
"def",
"update",
"(",
"self",
",",
"iiter",
",",
"H",
",",
"Y",
",",
"eta",
",",
"loss",
")",
":",
"if",
"iiter",
"<=",
"self",
".",
"niter_trace",
"+",
"1",
":",
"self",
".",
"H",
"[",
"iiter",
"]",
"=",
"H",
"self",
".",
"Y",
"[",
"iiter",
... | Update the trace_var in new iteration | [
"Update",
"the",
"trace_var",
"in",
"new",
"iteration"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/trace_variable.py#L57-L71 | train |
mmp2/megaman | megaman/relaxation/trace_variable.py | TracingVariable.save | def save(cls,instance,filename):
"""Class method save for saving TracingVariable."""
filename = cls.correct_file_extension(filename)
try:
with open(filename,'wb') as f:
pickle.dump(instance,f,protocol=pickle.HIGHEST_PROTOCOL)
except MemoryError as e:
... | python | def save(cls,instance,filename):
"""Class method save for saving TracingVariable."""
filename = cls.correct_file_extension(filename)
try:
with open(filename,'wb') as f:
pickle.dump(instance,f,protocol=pickle.HIGHEST_PROTOCOL)
except MemoryError as e:
... | [
"def",
"save",
"(",
"cls",
",",
"instance",
",",
"filename",
")",
":",
"filename",
"=",
"cls",
".",
"correct_file_extension",
"(",
"filename",
")",
"try",
":",
"with",
"open",
"(",
"filename",
",",
"'wb'",
")",
"as",
"f",
":",
"pickle",
".",
"dump",
... | Class method save for saving TracingVariable. | [
"Class",
"method",
"save",
"for",
"saving",
"TracingVariable",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/trace_variable.py#L93-L106 | train |
mmp2/megaman | megaman/relaxation/trace_variable.py | TracingVariable.load | def load(cls,filename):
"""Load from stored files"""
filename = cls.correct_file_extension(filename)
with open(filename,'rb') as f:
return pickle.load(f) | python | def load(cls,filename):
"""Load from stored files"""
filename = cls.correct_file_extension(filename)
with open(filename,'rb') as f:
return pickle.load(f) | [
"def",
"load",
"(",
"cls",
",",
"filename",
")",
":",
"filename",
"=",
"cls",
".",
"correct_file_extension",
"(",
"filename",
")",
"with",
"open",
"(",
"filename",
",",
"'rb'",
")",
"as",
"f",
":",
"return",
"pickle",
".",
"load",
"(",
"f",
")"
] | Load from stored files | [
"Load",
"from",
"stored",
"files"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/relaxation/trace_variable.py#L109-L113 | train |
mmp2/megaman | doc/sphinxext/numpy_ext/utils.py | find_mod_objs | def find_mod_objs(modname, onlylocals=False):
""" Returns all the public attributes of a module referenced by name.
.. note::
The returned list *not* include subpackages or modules of
`modname`,nor does it include private attributes (those that
beginwith '_' or are not in `__all__`).
... | python | def find_mod_objs(modname, onlylocals=False):
""" Returns all the public attributes of a module referenced by name.
.. note::
The returned list *not* include subpackages or modules of
`modname`,nor does it include private attributes (those that
beginwith '_' or are not in `__all__`).
... | [
"def",
"find_mod_objs",
"(",
"modname",
",",
"onlylocals",
"=",
"False",
")",
":",
"__import__",
"(",
"modname",
")",
"mod",
"=",
"sys",
".",
"modules",
"[",
"modname",
"]",
"if",
"hasattr",
"(",
"mod",
",",
"'__all__'",
")",
":",
"pkgitems",
"=",
"[",... | Returns all the public attributes of a module referenced by name.
.. note::
The returned list *not* include subpackages or modules of
`modname`,nor does it include private attributes (those that
beginwith '_' or are not in `__all__`).
Parameters
----------
modname : str
... | [
"Returns",
"all",
"the",
"public",
"attributes",
"of",
"a",
"module",
"referenced",
"by",
"name",
"."
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/doc/sphinxext/numpy_ext/utils.py#L5-L65 | train |
mmp2/megaman | megaman/datasets/datasets.py | get_megaman_image | def get_megaman_image(factor=1):
"""Return an RGBA representation of the megaman icon"""
imfile = os.path.join(os.path.dirname(__file__), 'megaman.png')
data = ndimage.imread(imfile) / 255
if factor > 1:
data = data.repeat(factor, axis=0).repeat(factor, axis=1)
return data | python | def get_megaman_image(factor=1):
"""Return an RGBA representation of the megaman icon"""
imfile = os.path.join(os.path.dirname(__file__), 'megaman.png')
data = ndimage.imread(imfile) / 255
if factor > 1:
data = data.repeat(factor, axis=0).repeat(factor, axis=1)
return data | [
"def",
"get_megaman_image",
"(",
"factor",
"=",
"1",
")",
":",
"imfile",
"=",
"os",
".",
"path",
".",
"join",
"(",
"os",
".",
"path",
".",
"dirname",
"(",
"__file__",
")",
",",
"'megaman.png'",
")",
"data",
"=",
"ndimage",
".",
"imread",
"(",
"imfile... | Return an RGBA representation of the megaman icon | [
"Return",
"an",
"RGBA",
"representation",
"of",
"the",
"megaman",
"icon"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/datasets/datasets.py#L12-L18 | train |
mmp2/megaman | megaman/datasets/datasets.py | generate_megaman_data | def generate_megaman_data(sampling=2):
"""Generate 2D point data of the megaman image"""
data = get_megaman_image()
x = np.arange(sampling * data.shape[1]) / float(sampling)
y = np.arange(sampling * data.shape[0]) / float(sampling)
X, Y = map(np.ravel, np.meshgrid(x, y))
C = data[np.floor(Y.max(... | python | def generate_megaman_data(sampling=2):
"""Generate 2D point data of the megaman image"""
data = get_megaman_image()
x = np.arange(sampling * data.shape[1]) / float(sampling)
y = np.arange(sampling * data.shape[0]) / float(sampling)
X, Y = map(np.ravel, np.meshgrid(x, y))
C = data[np.floor(Y.max(... | [
"def",
"generate_megaman_data",
"(",
"sampling",
"=",
"2",
")",
":",
"data",
"=",
"get_megaman_image",
"(",
")",
"x",
"=",
"np",
".",
"arange",
"(",
"sampling",
"*",
"data",
".",
"shape",
"[",
"1",
"]",
")",
"/",
"float",
"(",
"sampling",
")",
"y",
... | Generate 2D point data of the megaman image | [
"Generate",
"2D",
"point",
"data",
"of",
"the",
"megaman",
"image"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/datasets/datasets.py#L21-L29 | train |
mmp2/megaman | megaman/datasets/datasets.py | _make_S_curve | def _make_S_curve(x, range=(-0.75, 0.75)):
"""Make a 2D S-curve from a 1D vector"""
assert x.ndim == 1
x = x - x.min()
theta = 2 * np.pi * (range[0] + (range[1] - range[0]) * x / x.max())
X = np.empty((x.shape[0], 2), dtype=float)
X[:, 0] = np.sign(theta) * (1 - np.cos(theta))
X[:, 1] = np.s... | python | def _make_S_curve(x, range=(-0.75, 0.75)):
"""Make a 2D S-curve from a 1D vector"""
assert x.ndim == 1
x = x - x.min()
theta = 2 * np.pi * (range[0] + (range[1] - range[0]) * x / x.max())
X = np.empty((x.shape[0], 2), dtype=float)
X[:, 0] = np.sign(theta) * (1 - np.cos(theta))
X[:, 1] = np.s... | [
"def",
"_make_S_curve",
"(",
"x",
",",
"range",
"=",
"(",
"-",
"0.75",
",",
"0.75",
")",
")",
":",
"assert",
"x",
".",
"ndim",
"==",
"1",
"x",
"=",
"x",
"-",
"x",
".",
"min",
"(",
")",
"theta",
"=",
"2",
"*",
"np",
".",
"pi",
"*",
"(",
"r... | Make a 2D S-curve from a 1D vector | [
"Make",
"a",
"2D",
"S",
"-",
"curve",
"from",
"a",
"1D",
"vector"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/datasets/datasets.py#L32-L41 | train |
mmp2/megaman | megaman/datasets/datasets.py | generate_megaman_manifold | def generate_megaman_manifold(sampling=2, nfolds=2,
rotate=True, random_state=None):
"""Generate a manifold of the megaman data"""
X, c = generate_megaman_data(sampling)
for i in range(nfolds):
X = np.hstack([_make_S_curve(x) for x in X.T])
if rotate:
rand ... | python | def generate_megaman_manifold(sampling=2, nfolds=2,
rotate=True, random_state=None):
"""Generate a manifold of the megaman data"""
X, c = generate_megaman_data(sampling)
for i in range(nfolds):
X = np.hstack([_make_S_curve(x) for x in X.T])
if rotate:
rand ... | [
"def",
"generate_megaman_manifold",
"(",
"sampling",
"=",
"2",
",",
"nfolds",
"=",
"2",
",",
"rotate",
"=",
"True",
",",
"random_state",
"=",
"None",
")",
":",
"X",
",",
"c",
"=",
"generate_megaman_data",
"(",
"sampling",
")",
"for",
"i",
"in",
"range",
... | Generate a manifold of the megaman data | [
"Generate",
"a",
"manifold",
"of",
"the",
"megaman",
"data"
] | faccaf267aad0a8b18ec8a705735fd9dd838ca1e | https://github.com/mmp2/megaman/blob/faccaf267aad0a8b18ec8a705735fd9dd838ca1e/megaman/datasets/datasets.py#L44-L57 | train |
presslabs/z3 | z3/ssh_sync.py | snapshots_to_send | def snapshots_to_send(source_snaps, dest_snaps):
"""return pair of snapshots"""
if len(source_snaps) == 0:
raise AssertionError("No snapshots exist locally!")
if len(dest_snaps) == 0:
# nothing on the remote side, send everything
return None, source_snaps[-1]
last_remote = dest_s... | python | def snapshots_to_send(source_snaps, dest_snaps):
"""return pair of snapshots"""
if len(source_snaps) == 0:
raise AssertionError("No snapshots exist locally!")
if len(dest_snaps) == 0:
# nothing on the remote side, send everything
return None, source_snaps[-1]
last_remote = dest_s... | [
"def",
"snapshots_to_send",
"(",
"source_snaps",
",",
"dest_snaps",
")",
":",
"if",
"len",
"(",
"source_snaps",
")",
"==",
"0",
":",
"raise",
"AssertionError",
"(",
"\"No snapshots exist locally!\"",
")",
"if",
"len",
"(",
"dest_snaps",
")",
"==",
"0",
":",
... | return pair of snapshots | [
"return",
"pair",
"of",
"snapshots"
] | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/ssh_sync.py#L25-L38 | train |
presslabs/z3 | z3/pput.py | StreamHandler.get_chunk | def get_chunk(self):
"""Return complete chunks or None if EOF reached"""
while not self._eof_reached:
read = self.input_stream.read(self.chunk_size - len(self._partial_chunk))
if len(read) == 0:
self._eof_reached = True
self._partial_chunk += read
... | python | def get_chunk(self):
"""Return complete chunks or None if EOF reached"""
while not self._eof_reached:
read = self.input_stream.read(self.chunk_size - len(self._partial_chunk))
if len(read) == 0:
self._eof_reached = True
self._partial_chunk += read
... | [
"def",
"get_chunk",
"(",
"self",
")",
":",
"while",
"not",
"self",
".",
"_eof_reached",
":",
"read",
"=",
"self",
".",
"input_stream",
".",
"read",
"(",
"self",
".",
"chunk_size",
"-",
"len",
"(",
"self",
".",
"_partial_chunk",
")",
")",
"if",
"len",
... | Return complete chunks or None if EOF reached | [
"Return",
"complete",
"chunks",
"or",
"None",
"if",
"EOF",
"reached"
] | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/pput.py#L76-L86 | train |
presslabs/z3 | z3/pput.py | UploadSupervisor._handle_result | def _handle_result(self):
"""Process one result. Block untill one is available
"""
result = self.inbox.get()
if result.success:
if self._verbosity >= VERB_PROGRESS:
sys.stderr.write("\nuploaded chunk {} \n".format(result.index))
self.results.append... | python | def _handle_result(self):
"""Process one result. Block untill one is available
"""
result = self.inbox.get()
if result.success:
if self._verbosity >= VERB_PROGRESS:
sys.stderr.write("\nuploaded chunk {} \n".format(result.index))
self.results.append... | [
"def",
"_handle_result",
"(",
"self",
")",
":",
"result",
"=",
"self",
".",
"inbox",
".",
"get",
"(",
")",
"if",
"result",
".",
"success",
":",
"if",
"self",
".",
"_verbosity",
">=",
"VERB_PROGRESS",
":",
"sys",
".",
"stderr",
".",
"write",
"(",
"\"\... | Process one result. Block untill one is available | [
"Process",
"one",
"result",
".",
"Block",
"untill",
"one",
"is",
"available"
] | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/pput.py#L201-L211 | train |
presslabs/z3 | z3/pput.py | UploadSupervisor._send_chunk | def _send_chunk(self, index, chunk):
"""Send the current chunk to the workers for processing.
Called when the _partial_chunk is complete.
Blocks when the outbox is full.
"""
self._pending_chunks += 1
self.outbox.put((index, chunk)) | python | def _send_chunk(self, index, chunk):
"""Send the current chunk to the workers for processing.
Called when the _partial_chunk is complete.
Blocks when the outbox is full.
"""
self._pending_chunks += 1
self.outbox.put((index, chunk)) | [
"def",
"_send_chunk",
"(",
"self",
",",
"index",
",",
"chunk",
")",
":",
"self",
".",
"_pending_chunks",
"+=",
"1",
"self",
".",
"outbox",
".",
"put",
"(",
"(",
"index",
",",
"chunk",
")",
")"
] | Send the current chunk to the workers for processing.
Called when the _partial_chunk is complete.
Blocks when the outbox is full. | [
"Send",
"the",
"current",
"chunk",
"to",
"the",
"workers",
"for",
"processing",
".",
"Called",
"when",
"the",
"_partial_chunk",
"is",
"complete",
"."
] | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/pput.py#L220-L227 | train |
presslabs/z3 | z3/config.py | OnionDict._get | def _get(self, key, section=None, default=_onion_dict_guard):
"""Try to get the key from each dict in turn.
If you specify the optional section it looks there first.
"""
if section is not None:
section_dict = self.__sections.get(section, {})
if key in section_dict... | python | def _get(self, key, section=None, default=_onion_dict_guard):
"""Try to get the key from each dict in turn.
If you specify the optional section it looks there first.
"""
if section is not None:
section_dict = self.__sections.get(section, {})
if key in section_dict... | [
"def",
"_get",
"(",
"self",
",",
"key",
",",
"section",
"=",
"None",
",",
"default",
"=",
"_onion_dict_guard",
")",
":",
"if",
"section",
"is",
"not",
"None",
":",
"section_dict",
"=",
"self",
".",
"__sections",
".",
"get",
"(",
"section",
",",
"{",
... | Try to get the key from each dict in turn.
If you specify the optional section it looks there first. | [
"Try",
"to",
"get",
"the",
"key",
"from",
"each",
"dict",
"in",
"turn",
".",
"If",
"you",
"specify",
"the",
"optional",
"section",
"it",
"looks",
"there",
"first",
"."
] | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/config.py#L21-L35 | train |
presslabs/z3 | z3/snap.py | ZFSSnapshotManager._parse_snapshots | def _parse_snapshots(self):
"""Returns all snapshots grouped by filesystem, a dict of OrderedDict's
The order of snapshots matters when determining parents for incremental send,
so it's preserved.
Data is indexed by filesystem then for each filesystem we have an OrderedDict
of sn... | python | def _parse_snapshots(self):
"""Returns all snapshots grouped by filesystem, a dict of OrderedDict's
The order of snapshots matters when determining parents for incremental send,
so it's preserved.
Data is indexed by filesystem then for each filesystem we have an OrderedDict
of sn... | [
"def",
"_parse_snapshots",
"(",
"self",
")",
":",
"try",
":",
"snap",
"=",
"self",
".",
"_list_snapshots",
"(",
")",
"except",
"OSError",
"as",
"err",
":",
"logging",
".",
"error",
"(",
"\"unable to list local snapshots!\"",
")",
"return",
"{",
"}",
"vols",
... | Returns all snapshots grouped by filesystem, a dict of OrderedDict's
The order of snapshots matters when determining parents for incremental send,
so it's preserved.
Data is indexed by filesystem then for each filesystem we have an OrderedDict
of snapshots. | [
"Returns",
"all",
"snapshots",
"grouped",
"by",
"filesystem",
"a",
"dict",
"of",
"OrderedDict",
"s",
"The",
"order",
"of",
"snapshots",
"matters",
"when",
"determining",
"parents",
"for",
"incremental",
"send",
"so",
"it",
"s",
"preserved",
".",
"Data",
"is",
... | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/snap.py#L176-L202 | train |
presslabs/z3 | z3/snap.py | PairManager._compress | def _compress(self, cmd):
"""Adds the appropriate command to compress the zfs stream"""
compressor = COMPRESSORS.get(self.compressor)
if compressor is None:
return cmd
compress_cmd = compressor['compress']
return "{} | {}".format(compress_cmd, cmd) | python | def _compress(self, cmd):
"""Adds the appropriate command to compress the zfs stream"""
compressor = COMPRESSORS.get(self.compressor)
if compressor is None:
return cmd
compress_cmd = compressor['compress']
return "{} | {}".format(compress_cmd, cmd) | [
"def",
"_compress",
"(",
"self",
",",
"cmd",
")",
":",
"compressor",
"=",
"COMPRESSORS",
".",
"get",
"(",
"self",
".",
"compressor",
")",
"if",
"compressor",
"is",
"None",
":",
"return",
"cmd",
"compress_cmd",
"=",
"compressor",
"[",
"'compress'",
"]",
"... | Adds the appropriate command to compress the zfs stream | [
"Adds",
"the",
"appropriate",
"command",
"to",
"compress",
"the",
"zfs",
"stream"
] | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/snap.py#L311-L317 | train |
presslabs/z3 | z3/snap.py | PairManager._decompress | def _decompress(self, cmd, s3_snap):
"""Adds the appropriate command to decompress the zfs stream
This is determined from the metadata of the s3_snap.
"""
compressor = COMPRESSORS.get(s3_snap.compressor)
if compressor is None:
return cmd
decompress_cmd = compr... | python | def _decompress(self, cmd, s3_snap):
"""Adds the appropriate command to decompress the zfs stream
This is determined from the metadata of the s3_snap.
"""
compressor = COMPRESSORS.get(s3_snap.compressor)
if compressor is None:
return cmd
decompress_cmd = compr... | [
"def",
"_decompress",
"(",
"self",
",",
"cmd",
",",
"s3_snap",
")",
":",
"compressor",
"=",
"COMPRESSORS",
".",
"get",
"(",
"s3_snap",
".",
"compressor",
")",
"if",
"compressor",
"is",
"None",
":",
"return",
"cmd",
"decompress_cmd",
"=",
"compressor",
"[",... | Adds the appropriate command to decompress the zfs stream
This is determined from the metadata of the s3_snap. | [
"Adds",
"the",
"appropriate",
"command",
"to",
"decompress",
"the",
"zfs",
"stream",
"This",
"is",
"determined",
"from",
"the",
"metadata",
"of",
"the",
"s3_snap",
"."
] | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/snap.py#L319-L327 | train |
presslabs/z3 | z3/snap.py | PairManager.backup_full | def backup_full(self, snap_name=None, dry_run=False):
"""Do a full backup of a snapshot. By default latest local snapshot"""
z_snap = self._snapshot_to_backup(snap_name)
estimated_size = self._parse_estimated_size(
self._cmd.shell(
"zfs send -nvP '{}'".format(z_snap.n... | python | def backup_full(self, snap_name=None, dry_run=False):
"""Do a full backup of a snapshot. By default latest local snapshot"""
z_snap = self._snapshot_to_backup(snap_name)
estimated_size = self._parse_estimated_size(
self._cmd.shell(
"zfs send -nvP '{}'".format(z_snap.n... | [
"def",
"backup_full",
"(",
"self",
",",
"snap_name",
"=",
"None",
",",
"dry_run",
"=",
"False",
")",
":",
"z_snap",
"=",
"self",
".",
"_snapshot_to_backup",
"(",
"snap_name",
")",
"estimated_size",
"=",
"self",
".",
"_parse_estimated_size",
"(",
"self",
".",... | Do a full backup of a snapshot. By default latest local snapshot | [
"Do",
"a",
"full",
"backup",
"of",
"a",
"snapshot",
".",
"By",
"default",
"latest",
"local",
"snapshot"
] | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/snap.py#L341-L359 | train |
presslabs/z3 | z3/snap.py | PairManager.backup_incremental | def backup_incremental(self, snap_name=None, dry_run=False):
"""Uploads named snapshot or latest, along with any other snapshots
required for an incremental backup.
"""
z_snap = self._snapshot_to_backup(snap_name)
to_upload = []
current = z_snap
uploaded_meta = []... | python | def backup_incremental(self, snap_name=None, dry_run=False):
"""Uploads named snapshot or latest, along with any other snapshots
required for an incremental backup.
"""
z_snap = self._snapshot_to_backup(snap_name)
to_upload = []
current = z_snap
uploaded_meta = []... | [
"def",
"backup_incremental",
"(",
"self",
",",
"snap_name",
"=",
"None",
",",
"dry_run",
"=",
"False",
")",
":",
"z_snap",
"=",
"self",
".",
"_snapshot_to_backup",
"(",
"snap_name",
")",
"to_upload",
"=",
"[",
"]",
"current",
"=",
"z_snap",
"uploaded_meta",
... | Uploads named snapshot or latest, along with any other snapshots
required for an incremental backup. | [
"Uploads",
"named",
"snapshot",
"or",
"latest",
"along",
"with",
"any",
"other",
"snapshots",
"required",
"for",
"an",
"incremental",
"backup",
"."
] | 965898cccddd351ce4c56402a215c3bda9f37b5e | https://github.com/presslabs/z3/blob/965898cccddd351ce4c56402a215c3bda9f37b5e/z3/snap.py#L361-L403 | train |
pyannote/pyannote-metrics | pyannote/metrics/utils.py | UEMSupportMixin.extrude | def extrude(self, uem, reference, collar=0.0, skip_overlap=False):
"""Extrude reference boundary collars from uem
reference |----| |--------------| |-------------|
uem |---------------------| |-------------------------------|
extruded |--| |--| |---| |-----| |... | python | def extrude(self, uem, reference, collar=0.0, skip_overlap=False):
"""Extrude reference boundary collars from uem
reference |----| |--------------| |-------------|
uem |---------------------| |-------------------------------|
extruded |--| |--| |---| |-----| |... | [
"def",
"extrude",
"(",
"self",
",",
"uem",
",",
"reference",
",",
"collar",
"=",
"0.0",
",",
"skip_overlap",
"=",
"False",
")",
":",
"if",
"collar",
"==",
"0.",
"and",
"not",
"skip_overlap",
":",
"return",
"uem",
"collars",
",",
"overlap_regions",
"=",
... | Extrude reference boundary collars from uem
reference |----| |--------------| |-------------|
uem |---------------------| |-------------------------------|
extruded |--| |--| |---| |-----| |-| |-----| |-----------| |-----|
Parameters
----------
... | [
"Extrude",
"reference",
"boundary",
"collars",
"from",
"uem"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/utils.py#L38-L93 | train |
pyannote/pyannote-metrics | pyannote/metrics/utils.py | UEMSupportMixin.common_timeline | def common_timeline(self, reference, hypothesis):
"""Return timeline common to both reference and hypothesis
reference |--------| |------------| |---------| |----|
hypothesis |--------------| |------| |----------------|
timeline |--|-----|----|---|-|------| |... | python | def common_timeline(self, reference, hypothesis):
"""Return timeline common to both reference and hypothesis
reference |--------| |------------| |---------| |----|
hypothesis |--------------| |------| |----------------|
timeline |--|-----|----|---|-|------| |... | [
"def",
"common_timeline",
"(",
"self",
",",
"reference",
",",
"hypothesis",
")",
":",
"timeline",
"=",
"reference",
".",
"get_timeline",
"(",
"copy",
"=",
"True",
")",
"timeline",
".",
"update",
"(",
"hypothesis",
".",
"get_timeline",
"(",
"copy",
"=",
"Fa... | Return timeline common to both reference and hypothesis
reference |--------| |------------| |---------| |----|
hypothesis |--------------| |------| |----------------|
timeline |--|-----|----|---|-|------| |-|---------|----| |----|
Parameters
-----... | [
"Return",
"timeline",
"common",
"to",
"both",
"reference",
"and",
"hypothesis"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/utils.py#L95-L113 | train |
pyannote/pyannote-metrics | pyannote/metrics/utils.py | UEMSupportMixin.project | def project(self, annotation, timeline):
"""Project annotation onto timeline segments
reference |__A__| |__B__|
|____C____|
timeline |---|---|---| |---|
projection |_A_|_A_|_C_| |_B_|
|_C_|
Parameters
---... | python | def project(self, annotation, timeline):
"""Project annotation onto timeline segments
reference |__A__| |__B__|
|____C____|
timeline |---|---|---| |---|
projection |_A_|_A_|_C_| |_B_|
|_C_|
Parameters
---... | [
"def",
"project",
"(",
"self",
",",
"annotation",
",",
"timeline",
")",
":",
"projection",
"=",
"annotation",
".",
"empty",
"(",
")",
"timeline_",
"=",
"annotation",
".",
"get_timeline",
"(",
"copy",
"=",
"False",
")",
"for",
"segment_",
",",
"segment",
... | Project annotation onto timeline segments
reference |__A__| |__B__|
|____C____|
timeline |---|---|---| |---|
projection |_A_|_A_|_C_| |_B_|
|_C_|
Parameters
----------
annotation : Annotation
time... | [
"Project",
"annotation",
"onto",
"timeline",
"segments"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/utils.py#L115-L141 | train |
pyannote/pyannote-metrics | pyannote/metrics/utils.py | UEMSupportMixin.uemify | def uemify(self, reference, hypothesis, uem=None, collar=0.,
skip_overlap=False, returns_uem=False, returns_timeline=False):
"""Crop 'reference' and 'hypothesis' to 'uem' support
Parameters
----------
reference, hypothesis : Annotation
Reference and hypothesis... | python | def uemify(self, reference, hypothesis, uem=None, collar=0.,
skip_overlap=False, returns_uem=False, returns_timeline=False):
"""Crop 'reference' and 'hypothesis' to 'uem' support
Parameters
----------
reference, hypothesis : Annotation
Reference and hypothesis... | [
"def",
"uemify",
"(",
"self",
",",
"reference",
",",
"hypothesis",
",",
"uem",
"=",
"None",
",",
"collar",
"=",
"0.",
",",
"skip_overlap",
"=",
"False",
",",
"returns_uem",
"=",
"False",
",",
"returns_timeline",
"=",
"False",
")",
":",
"# when uem is not p... | Crop 'reference' and 'hypothesis' to 'uem' support
Parameters
----------
reference, hypothesis : Annotation
Reference and hypothesis annotations.
uem : Timeline, optional
Evaluation map.
collar : float, optional
When provided, set the duration... | [
"Crop",
"reference",
"and",
"hypothesis",
"to",
"uem",
"support"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/utils.py#L143-L210 | train |
pyannote/pyannote-metrics | scripts/pyannote-metrics.py | get_hypothesis | def get_hypothesis(hypotheses, current_file):
"""Get hypothesis for given file
Parameters
----------
hypotheses : `dict`
Speaker diarization hypothesis provided by `load_rttm`.
current_file : `dict`
File description as given by pyannote.database protocols.
Returns
-------
... | python | def get_hypothesis(hypotheses, current_file):
"""Get hypothesis for given file
Parameters
----------
hypotheses : `dict`
Speaker diarization hypothesis provided by `load_rttm`.
current_file : `dict`
File description as given by pyannote.database protocols.
Returns
-------
... | [
"def",
"get_hypothesis",
"(",
"hypotheses",
",",
"current_file",
")",
":",
"uri",
"=",
"current_file",
"[",
"'uri'",
"]",
"if",
"uri",
"in",
"hypotheses",
":",
"return",
"hypotheses",
"[",
"uri",
"]",
"# if the exact 'uri' is not available in hypothesis,",
"# look f... | Get hypothesis for given file
Parameters
----------
hypotheses : `dict`
Speaker diarization hypothesis provided by `load_rttm`.
current_file : `dict`
File description as given by pyannote.database protocols.
Returns
-------
hypothesis : `pyannote.core.Annotation`
Hy... | [
"Get",
"hypothesis",
"for",
"given",
"file"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/scripts/pyannote-metrics.py#L142-L181 | train |
pyannote/pyannote-metrics | scripts/pyannote-metrics.py | reindex | def reindex(report):
"""Reindex report so that 'TOTAL' is the last row"""
index = list(report.index)
i = index.index('TOTAL')
return report.reindex(index[:i] + index[i+1:] + ['TOTAL']) | python | def reindex(report):
"""Reindex report so that 'TOTAL' is the last row"""
index = list(report.index)
i = index.index('TOTAL')
return report.reindex(index[:i] + index[i+1:] + ['TOTAL']) | [
"def",
"reindex",
"(",
"report",
")",
":",
"index",
"=",
"list",
"(",
"report",
".",
"index",
")",
"i",
"=",
"index",
".",
"index",
"(",
"'TOTAL'",
")",
"return",
"report",
".",
"reindex",
"(",
"index",
"[",
":",
"i",
"]",
"+",
"index",
"[",
"i",... | Reindex report so that 'TOTAL' is the last row | [
"Reindex",
"report",
"so",
"that",
"TOTAL",
"is",
"the",
"last",
"row"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/scripts/pyannote-metrics.py#L219-L223 | train |
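The `reindex` record above moves the 'TOTAL' row to the end of a pandas report. A minimal stdlib-only sketch of the same index shuffle, using a plain list in place of a pandas index (function name is mine, not from the source):

```python
def reindex_keys(index, last_key="TOTAL"):
    # Move `last_key` to the end of the index, preserving the order
    # of all other entries (mirrors the pandas `reindex` call above).
    i = index.index(last_key)
    return index[:i] + index[i + 1:] + [last_key]

print(reindex_keys(["file1", "TOTAL", "file2"]))  # ['file1', 'file2', 'TOTAL']
```

The pandas version then passes this reordered list back to `report.reindex(...)` to materialize the new row order.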
pyannote/pyannote-metrics | pyannote/metrics/binary_classification.py | precision_recall_curve | def precision_recall_curve(y_true, scores, distances=False):
"""Precision-recall curve
Parameters
----------
y_true : (n_samples, ) array-like
Boolean reference.
scores : (n_samples, ) array-like
Predicted score.
distances : boolean, optional
When True, indicate that `sc... | python | def precision_recall_curve(y_true, scores, distances=False):
"""Precision-recall curve
Parameters
----------
y_true : (n_samples, ) array-like
Boolean reference.
scores : (n_samples, ) array-like
Predicted score.
distances : boolean, optional
When True, indicate that `sc... | [
"def",
"precision_recall_curve",
"(",
"y_true",
",",
"scores",
",",
"distances",
"=",
"False",
")",
":",
"if",
"distances",
":",
"scores",
"=",
"-",
"scores",
"precision",
",",
"recall",
",",
"thresholds",
"=",
"sklearn",
".",
"metrics",
".",
"precision_reca... | Precision-recall curve
Parameters
----------
y_true : (n_samples, ) array-like
Boolean reference.
scores : (n_samples, ) array-like
Predicted score.
distances : boolean, optional
When True, indicate that `scores` are actually `distances`
Returns
-------
precisio... | [
"Precision",
"-",
"recall",
"curve"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/binary_classification.py#L81-L117 | train |
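The `precision_recall_curve` record above wraps `sklearn.metrics.precision_recall_curve`, negating scores when they are distances so that larger values always mean "more likely positive". A dependency-free sketch of the underlying computation (it does not replicate sklearn's tie handling exactly; one point per ranked sample):

```python
def precision_recall_points(y_true, scores, distances=False):
    """Compute (precision, recall) at each prefix of the score ranking.

    With distances=True, scores are negated first, matching the
    convention in the sklearn-based wrapper above.
    """
    if distances:
        scores = [-s for s in scores]
    # Sort samples by decreasing score; each prefix of this ordering
    # corresponds to one decision threshold.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    n_pos = sum(y_true)
    tp = 0
    points = []
    for rank, i in enumerate(order, start=1):
        tp += y_true[i]
        points.append((tp / rank, tp / n_pos))
    return points

pts = precision_recall_points([1, 0, 1, 1], [0.9, 0.8, 0.7, 0.3])
```

The final point always has recall 1.0, since every sample is retrieved at the loosest threshold.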
pyannote/pyannote-metrics | pyannote/metrics/errors/identification.py | IdentificationErrorAnalysis.difference | def difference(self, reference, hypothesis, uem=None, uemified=False):
"""Get error analysis as `Annotation`
Labels are (status, reference_label, hypothesis_label) tuples.
`status` is either 'correct', 'confusion', 'missed detection' or
'false alarm'.
`reference_label` is None i... | python | def difference(self, reference, hypothesis, uem=None, uemified=False):
"""Get error analysis as `Annotation`
Labels are (status, reference_label, hypothesis_label) tuples.
`status` is either 'correct', 'confusion', 'missed detection' or
'false alarm'.
`reference_label` is None i... | [
"def",
"difference",
"(",
"self",
",",
"reference",
",",
"hypothesis",
",",
"uem",
"=",
"None",
",",
"uemified",
"=",
"False",
")",
":",
"R",
",",
"H",
",",
"common_timeline",
"=",
"self",
".",
"uemify",
"(",
"reference",
",",
"hypothesis",
",",
"uem",... | Get error analysis as `Annotation`
Labels are (status, reference_label, hypothesis_label) tuples.
`status` is either 'correct', 'confusion', 'missed detection' or
'false alarm'.
`reference_label` is None in case of 'false alarm'.
`hypothesis_label` is None in case of 'missed det... | [
"Get",
"error",
"analysis",
"as",
"Annotation"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/errors/identification.py#L75-L134 | train |
pyannote/pyannote-metrics | pyannote/metrics/base.py | BaseMetric.reset | def reset(self):
"""Reset accumulated components and metric values"""
if self.parallel:
from pyannote.metrics import manager_
self.accumulated_ = manager_.dict()
self.results_ = manager_.list()
self.uris_ = manager_.dict()
else:
self.ac... | python | def reset(self):
"""Reset accumulated components and metric values"""
if self.parallel:
from pyannote.metrics import manager_
self.accumulated_ = manager_.dict()
self.results_ = manager_.list()
self.uris_ = manager_.dict()
else:
self.ac... | [
"def",
"reset",
"(",
"self",
")",
":",
"if",
"self",
".",
"parallel",
":",
"from",
"pyannote",
".",
"metrics",
"import",
"manager_",
"self",
".",
"accumulated_",
"=",
"manager_",
".",
"dict",
"(",
")",
"self",
".",
"results_",
"=",
"manager_",
".",
"li... | Reset accumulated components and metric values | [
"Reset",
"accumulated",
"components",
"and",
"metric",
"values"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/base.py#L76-L88 | train |
pyannote/pyannote-metrics | pyannote/metrics/base.py | BaseMetric.confidence_interval | def confidence_interval(self, alpha=0.9):
"""Compute confidence interval on accumulated metric values
Parameters
----------
alpha : float, optional
Probability that the returned confidence interval contains
the true metric value.
Returns
-------
... | python | def confidence_interval(self, alpha=0.9):
"""Compute confidence interval on accumulated metric values
Parameters
----------
alpha : float, optional
Probability that the returned confidence interval contains
the true metric value.
Returns
-------
... | [
"def",
"confidence_interval",
"(",
"self",
",",
"alpha",
"=",
"0.9",
")",
":",
"m",
",",
"_",
",",
"_",
"=",
"scipy",
".",
"stats",
".",
"bayes_mvs",
"(",
"[",
"r",
"[",
"self",
".",
"metric_name_",
"]",
"for",
"_",
",",
"r",
"in",
"self",
".",
... | Compute confidence interval on accumulated metric values
Parameters
----------
alpha : float, optional
Probability that the returned confidence interval contains
the true metric value.
Returns
-------
(center, (lower, upper))
with cen... | [
"Compute",
"confidence",
"interval",
"on",
"accumulated",
"metric",
"values"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/base.py#L296-L319 | train |
pyannote/pyannote-metrics | pyannote/metrics/base.py | Precision.compute_metric | def compute_metric(self, components):
"""Compute precision from `components`"""
numerator = components[PRECISION_RELEVANT_RETRIEVED]
denominator = components[PRECISION_RETRIEVED]
if denominator == 0.:
if numerator == 0:
return 1.
else:
... | python | def compute_metric(self, components):
"""Compute precision from `components`"""
numerator = components[PRECISION_RELEVANT_RETRIEVED]
denominator = components[PRECISION_RETRIEVED]
if denominator == 0.:
if numerator == 0:
return 1.
else:
... | [
"def",
"compute_metric",
"(",
"self",
",",
"components",
")",
":",
"numerator",
"=",
"components",
"[",
"PRECISION_RELEVANT_RETRIEVED",
"]",
"denominator",
"=",
"components",
"[",
"PRECISION_RETRIEVED",
"]",
"if",
"denominator",
"==",
"0.",
":",
"if",
"numerator",... | Compute precision from `components` | [
"Compute",
"precision",
"from",
"components"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/base.py#L347-L357 | train |
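The `Precision.compute_metric` record above defines precision as 1.0 when nothing was retrieved and nothing relevant was retrieved, and raises otherwise. A stdlib sketch of that zero-denominator convention (function and argument names are mine; the component constants in the source are `PRECISION_RELEVANT_RETRIEVED` and `PRECISION_RETRIEVED`):

```python
def precision_from_counts(relevant_retrieved, retrieved):
    # Mirrors the convention above: when nothing was retrieved,
    # precision is 1.0 if nothing relevant was retrieved either,
    # and an error otherwise.
    if retrieved == 0.0:
        if relevant_retrieved == 0:
            return 1.0
        raise ValueError("retrieved nothing, yet relevant_retrieved > 0")
    return relevant_retrieved / retrieved

print(precision_from_counts(3, 4))  # 0.75
print(precision_from_counts(0, 0))  # 1.0
```

The `Recall.compute_metric` record below applies the identical convention with `RECALL_RELEVANT` in the denominator.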
pyannote/pyannote-metrics | pyannote/metrics/base.py | Recall.compute_metric | def compute_metric(self, components):
"""Compute recall from `components`"""
numerator = components[RECALL_RELEVANT_RETRIEVED]
denominator = components[RECALL_RELEVANT]
if denominator == 0.:
if numerator == 0:
return 1.
else:
raise ... | python | def compute_metric(self, components):
"""Compute recall from `components`"""
numerator = components[RECALL_RELEVANT_RETRIEVED]
denominator = components[RECALL_RELEVANT]
if denominator == 0.:
if numerator == 0:
return 1.
else:
raise ... | [
"def",
"compute_metric",
"(",
"self",
",",
"components",
")",
":",
"numerator",
"=",
"components",
"[",
"RECALL_RELEVANT_RETRIEVED",
"]",
"denominator",
"=",
"components",
"[",
"RECALL_RELEVANT",
"]",
"if",
"denominator",
"==",
"0.",
":",
"if",
"numerator",
"=="... | Compute recall from `components` | [
"Compute",
"recall",
"from",
"components"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/base.py#L384-L394 | train |
pyannote/pyannote-metrics | pyannote/metrics/diarization.py | DiarizationErrorRate.optimal_mapping | def optimal_mapping(self, reference, hypothesis, uem=None):
"""Optimal label mapping
Parameters
----------
reference : Annotation
hypothesis : Annotation
Reference and hypothesis diarization
uem : Timeline
Evaluation map
Returns
-... | python | def optimal_mapping(self, reference, hypothesis, uem=None):
"""Optimal label mapping
Parameters
----------
reference : Annotation
hypothesis : Annotation
Reference and hypothesis diarization
uem : Timeline
Evaluation map
Returns
-... | [
"def",
"optimal_mapping",
"(",
"self",
",",
"reference",
",",
"hypothesis",
",",
"uem",
"=",
"None",
")",
":",
"# NOTE that this 'uemification' will not be called when",
"# 'optimal_mapping' is called from 'compute_components' as it",
"# has already been done in 'compute_components'"... | Optimal label mapping
Parameters
----------
reference : Annotation
hypothesis : Annotation
Reference and hypothesis diarization
uem : Timeline
Evaluation map
Returns
-------
mapping : dict
Mapping between hypothesis (k... | [
"Optimal",
"label",
"mapping"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/diarization.py#L106-L131 | train |
pyannote/pyannote-metrics | pyannote/metrics/diarization.py | GreedyDiarizationErrorRate.greedy_mapping | def greedy_mapping(self, reference, hypothesis, uem=None):
"""Greedy label mapping
Parameters
----------
reference : Annotation
hypothesis : Annotation
Reference and hypothesis diarization
uem : Timeline
Evaluation map
Returns
---... | python | def greedy_mapping(self, reference, hypothesis, uem=None):
"""Greedy label mapping
Parameters
----------
reference : Annotation
hypothesis : Annotation
Reference and hypothesis diarization
uem : Timeline
Evaluation map
Returns
---... | [
"def",
"greedy_mapping",
"(",
"self",
",",
"reference",
",",
"hypothesis",
",",
"uem",
"=",
"None",
")",
":",
"if",
"uem",
":",
"reference",
",",
"hypothesis",
"=",
"self",
".",
"uemify",
"(",
"reference",
",",
"hypothesis",
",",
"uem",
"=",
"uem",
")"... | Greedy label mapping
Parameters
----------
reference : Annotation
hypothesis : Annotation
Reference and hypothesis diarization
uem : Timeline
Evaluation map
Returns
-------
mapping : dict
Mapping between hypothesis (ke... | [
"Greedy",
"label",
"mapping"
] | b433fec3bd37ca36fe026a428cd72483d646871a | https://github.com/pyannote/pyannote-metrics/blob/b433fec3bd37ca36fe026a428cd72483d646871a/pyannote/metrics/diarization.py#L223-L241 | train |
brian-rose/climlab | climlab/radiation/radiation.py | default_absorbers | def default_absorbers(Tatm,
ozone_file = 'apeozone_cam3_5_54.nc',
verbose = True,):
'''Initialize a dictionary of well-mixed radiatively active gases
All values are volumetric mixing ratios.
Ozone is set to a climatology.
All other gases are assumed well-mixed:
... | python | def default_absorbers(Tatm,
ozone_file = 'apeozone_cam3_5_54.nc',
verbose = True,):
'''Initialize a dictionary of well-mixed radiatively active gases
All values are volumetric mixing ratios.
Ozone is set to a climatology.
All other gases are assumed well-mixed:
... | [
"def",
"default_absorbers",
"(",
"Tatm",
",",
"ozone_file",
"=",
"'apeozone_cam3_5_54.nc'",
",",
"verbose",
"=",
"True",
",",
")",
":",
"absorber_vmr",
"=",
"{",
"}",
"absorber_vmr",
"[",
"'CO2'",
"]",
"=",
"348.",
"/",
"1E6",
"absorber_vmr",
"[",
"'CH4'",
... | Initialize a dictionary of well-mixed radiatively active gases
All values are volumetric mixing ratios.
Ozone is set to a climatology.
All other gases are assumed well-mixed:
- CO2
- CH4
- N2O
- O2
- CFC11
- CFC12
- CFC22
- CCL4
Specifi... | [
"Initialize",
"a",
"dictionary",
"of",
"well",
"-",
"mixed",
"radiatively",
"active",
"gases",
"All",
"values",
"are",
"volumetric",
"mixing",
"ratios",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/radiation/radiation.py#L98-L166 | train |
brian-rose/climlab | climlab/radiation/radiation.py | init_interface | def init_interface(field):
'''Return a Field object defined at the vertical interfaces of the input Field object.'''
interface_shape = np.array(field.shape); interface_shape[-1] += 1
interfaces = np.tile(False,len(interface_shape)); interfaces[-1] = True
interface_zero = Field(np.zeros(interface_shape),... | python | def init_interface(field):
'''Return a Field object defined at the vertical interfaces of the input Field object.'''
interface_shape = np.array(field.shape); interface_shape[-1] += 1
interfaces = np.tile(False,len(interface_shape)); interfaces[-1] = True
interface_zero = Field(np.zeros(interface_shape),... | [
"def",
"init_interface",
"(",
"field",
")",
":",
"interface_shape",
"=",
"np",
".",
"array",
"(",
"field",
".",
"shape",
")",
"interface_shape",
"[",
"-",
"1",
"]",
"+=",
"1",
"interfaces",
"=",
"np",
".",
"tile",
"(",
"False",
",",
"len",
"(",
"inte... | Return a Field object defined at the vertical interfaces of the input Field object. | [
"Return",
"a",
"Field",
"object",
"defined",
"at",
"the",
"vertical",
"interfaces",
"of",
"the",
"input",
"Field",
"object",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/radiation/radiation.py#L168-L173 | train |
brian-rose/climlab | climlab/convection/akmaev_adjustment.py | convective_adjustment_direct | def convective_adjustment_direct(p, T, c, lapserate=6.5):
"""Convective Adjustment to a specified lapse rate.
Input argument lapserate gives the lapse rate expressed in degrees K per km
(positive means temperature increasing downward).
Default lapse rate is 6.5 K / km.
Returns the adjusted Column... | python | def convective_adjustment_direct(p, T, c, lapserate=6.5):
"""Convective Adjustment to a specified lapse rate.
Input argument lapserate gives the lapse rate expressed in degrees K per km
(positive means temperature increasing downward).
Default lapse rate is 6.5 K / km.
Returns the adjusted Column... | [
"def",
"convective_adjustment_direct",
"(",
"p",
",",
"T",
",",
"c",
",",
"lapserate",
"=",
"6.5",
")",
":",
"# largely follows notation and algorithm in Akmaev (1991) MWR",
"alpha",
"=",
"const",
".",
"Rd",
"/",
"const",
".",
"g",
"*",
"lapserate",
"/",
"1.E3",... | Convective Adjustment to a specified lapse rate.
Input argument lapserate gives the lapse rate expressed in degrees K per km
(positive means temperature increasing downward).
Default lapse rate is 6.5 K / km.
Returns the adjusted Column temperature.
inputs:
p is pressure in hPa
T is tempe... | [
"Convective",
"Adjustment",
"to",
"a",
"specified",
"lapse",
"rate",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/convection/akmaev_adjustment.py#L7-L39 | train |
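The `convective_adjustment_direct` record above computes `alpha = Rd / g * lapserate / 1e3` (lapse rate in K/km, following the Akmaev 1991 notation it cites). A minimal sketch of that exponent and the constant-lapse-rate temperature profile it implies; the numeric values of `Rd` and `g` below are assumed, not taken from the record:

```python
Rd = 287.0   # gas constant for dry air, J/kg/K (assumed value)
g = 9.8      # gravitational acceleration, m/s**2 (assumed value)

def lapse_exponent(lapserate_K_per_km):
    # alpha such that T ~ p**alpha along a hydrostatic profile with
    # the given constant lapse rate.
    return Rd / g * lapserate_K_per_km / 1.0e3

def profile_temperature(T0, p0, p, lapserate_K_per_km):
    # Temperature at pressure p on a constant-lapse-rate profile
    # anchored at (T0, p0): T = T0 * (p / p0) ** alpha.
    return T0 * (p / p0) ** lapse_exponent(lapserate_K_per_km)

alpha = lapse_exponent(6.5)  # default 6.5 K/km, as in the record above
```

The adjustment routine then conserves column enthalpy while relaxing temperatures toward such a profile.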
brian-rose/climlab | climlab/convection/akmaev_adjustment.py | Akmaev_adjustment | def Akmaev_adjustment(theta, q, beta, n_k, theta_k, s_k, t_k):
'''Single column only.'''
L = q.size # number of vertical levels
# Akmaev step 1
k = 1
n_k[k-1] = 1
theta_k[k-1] = theta[k-1]
l = 2
while True:
# Akmaev step 2
n = 1
thistheta = theta[l-1]
whi... | python | def Akmaev_adjustment(theta, q, beta, n_k, theta_k, s_k, t_k):
'''Single column only.'''
L = q.size # number of vertical levels
# Akmaev step 1
k = 1
n_k[k-1] = 1
theta_k[k-1] = theta[k-1]
l = 2
while True:
# Akmaev step 2
n = 1
thistheta = theta[l-1]
whi... | [
"def",
"Akmaev_adjustment",
"(",
"theta",
",",
"q",
",",
"beta",
",",
"n_k",
",",
"theta_k",
",",
"s_k",
",",
"t_k",
")",
":",
"L",
"=",
"q",
".",
"size",
"# number of vertical levels",
"# Akmaev step 1",
"k",
"=",
"1",
"n_k",
"[",
"k",
"-",
"1",
"]"... | Single column only. | [
"Single",
"column",
"only",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/convection/akmaev_adjustment.py#L58-L129 | train |
brian-rose/climlab | climlab/model/column.py | GreyRadiationModel.do_diagnostics | def do_diagnostics(self):
'''Set all the diagnostics from long and shortwave radiation.'''
self.OLR = self.subprocess['LW'].flux_to_space
self.LW_down_sfc = self.subprocess['LW'].flux_to_sfc
self.LW_up_sfc = self.subprocess['LW'].flux_from_sfc
self.LW_absorbed_sfc = self.LW_down_... | python | def do_diagnostics(self):
'''Set all the diagnostics from long and shortwave radiation.'''
self.OLR = self.subprocess['LW'].flux_to_space
self.LW_down_sfc = self.subprocess['LW'].flux_to_sfc
self.LW_up_sfc = self.subprocess['LW'].flux_from_sfc
self.LW_absorbed_sfc = self.LW_down_... | [
"def",
"do_diagnostics",
"(",
"self",
")",
":",
"self",
".",
"OLR",
"=",
"self",
".",
"subprocess",
"[",
"'LW'",
"]",
".",
"flux_to_space",
"self",
".",
"LW_down_sfc",
"=",
"self",
".",
"subprocess",
"[",
"'LW'",
"]",
".",
"flux_to_sfc",
"self",
".",
"... | Set all the diagnostics from long and shortwave radiation. | [
"Set",
"all",
"the",
"diagnostics",
"from",
"long",
"and",
"shortwave",
"radiation",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/model/column.py#L119-L141 | train |
brian-rose/climlab | climlab/utils/thermo.py | clausius_clapeyron | def clausius_clapeyron(T):

"""Compute saturation vapor pressure as function of temperature T.
Input: T is temperature in Kelvin
Output: saturation vapor pressure in mb or hPa
Formula from Rogers and Yau "A Short Course in Cloud Physics" (Pergamon Press), p. 16
claimed to be accurate to within 0.1... | python | def clausius_clapeyron(T):
"""Compute saturation vapor pressure as function of temperature T.
Input: T is temperature in Kelvin
Output: saturation vapor pressure in mb or hPa
Formula from Rogers and Yau "A Short Course in Cloud Physics" (Pergamon Press), p. 16
claimed to be accurate to within 0.1... | [
"def",
"clausius_clapeyron",
"(",
"T",
")",
":",
"Tcel",
"=",
"T",
"-",
"tempCtoK",
"es",
"=",
"6.112",
"*",
"exp",
"(",
"17.67",
"*",
"Tcel",
"/",
"(",
"Tcel",
"+",
"243.5",
")",
")",
"return",
"es"
] | Compute saturation vapor pressure as function of temperature T.
Input: T is temperature in Kelvin
Output: saturation vapor pressure in mb or hPa
Formula from Rogers and Yau "A Short Course in Cloud Physics" (Pergamon Press), p. 16
claimed to be accurate to within 0.1% between -30degC and 35 degC
... | [
"Compute",
"saturation",
"vapor",
"pressure",
"as",
"function",
"of",
"temperature",
"T",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/utils/thermo.py#L41-L54 | train |
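The `clausius_clapeyron` record above uses the Rogers & Yau empirical fit. A self-contained sketch of that formula (function name is mine; the constant 273.15 stands in for the record's `tempCtoK`):

```python
from math import exp

def saturation_vapor_pressure(T):
    """Saturation vapor pressure (hPa) from temperature T (K).

    Same Rogers & Yau fit as quoted above:
    es = 6.112 * exp(17.67 * Tcel / (Tcel + 243.5)),
    stated accurate to ~0.1% between -30 degC and +35 degC.
    """
    Tcel = T - 273.15  # Kelvin -> Celsius
    return 6.112 * exp(17.67 * Tcel / (Tcel + 243.5))

print(round(saturation_vapor_pressure(273.15), 3))  # 6.112 at 0 degC
```

At 0 °C the exponential is exp(0) = 1, so the fit returns its leading coefficient, 6.112 hPa, directly.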
brian-rose/climlab | climlab/utils/thermo.py | qsat | def qsat(T,p):
"""Compute saturation specific humidity as function of temperature and pressure.
Input: T is temperature in Kelvin
p is pressure in hPa or mb
Output: saturation specific humidity (dimensionless).
"""
es = clausius_clapeyron(T)
q = eps * es / (p - (1 - eps) * es )
... | python | def qsat(T,p):
"""Compute saturation specific humidity as function of temperature and pressure.
Input: T is temperature in Kelvin
p is pressure in hPa or mb
Output: saturation specific humidity (dimensionless).
"""
es = clausius_clapeyron(T)
q = eps * es / (p - (1 - eps) * es )
... | [
"def",
"qsat",
"(",
"T",
",",
"p",
")",
":",
"es",
"=",
"clausius_clapeyron",
"(",
"T",
")",
"q",
"=",
"eps",
"*",
"es",
"/",
"(",
"p",
"-",
"(",
"1",
"-",
"eps",
")",
"*",
"es",
")",
"return",
"q"
] | Compute saturation specific humidity as function of temperature and pressure.
Input: T is temperature in Kelvin
p is pressure in hPa or mb
Output: saturation specific humidity (dimensionless). | [
"Compute",
"saturation",
"specific",
"humidity",
"as",
"function",
"of",
"temperature",
"and",
"pressure",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/utils/thermo.py#L56-L66 | train |
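The `qsat` record above combines the saturation vapor pressure with `q = eps * es / (p - (1 - eps) * es)`. A stdlib sketch, with `eps` set to the usual molecular-weight ratio (the record does not show its value, so 0.622 is an assumption):

```python
from math import exp

eps = 0.622  # ratio of molecular weights, water vapor / dry air (assumed value)

def saturation_specific_humidity(T, p):
    # q* = eps * es / (p - (1 - eps) * es), with es (hPa) from the
    # Rogers & Yau fit used in clausius_clapeyron above; T in K, p in hPa.
    Tcel = T - 273.15
    es = 6.112 * exp(17.67 * Tcel / (Tcel + 243.5))
    return eps * es / (p - (1 - eps) * es)

q = saturation_specific_humidity(300.0, 1000.0)  # roughly 0.022 kg/kg
```

Warm, near-surface tropical air (300 K, 1000 hPa) thus saturates at a specific humidity of about 22 g/kg, which is why moist convection matters most there.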
brian-rose/climlab | climlab/utils/thermo.py | pseudoadiabat | def pseudoadiabat(T,p):
"""Compute the local slope of the pseudoadiabat at given temperature and pressure
Inputs: p is pressure in hPa or mb
T is local temperature in Kelvin
Output: dT/dp, the rate of temperature change for pseudoadiabatic ascent
in units of K / hPa
T... | python | def pseudoadiabat(T,p):
"""Compute the local slope of the pseudoadiabat at given temperature and pressure
Inputs: p is pressure in hPa or mb
T is local temperature in Kelvin
Output: dT/dp, the rate of temperature change for pseudoadiabatic ascent
in units of K / hPa
T... | [
"def",
"pseudoadiabat",
"(",
"T",
",",
"p",
")",
":",
"esoverp",
"=",
"clausius_clapeyron",
"(",
"T",
")",
"/",
"p",
"Tcel",
"=",
"T",
"-",
"tempCtoK",
"L",
"=",
"(",
"2.501",
"-",
"0.00237",
"*",
"Tcel",
")",
"*",
"1.E6",
"# Accurate form of latent he... | Compute the local slope of the pseudoadiabat at given temperature and pressure
Inputs: p is pressure in hPa or mb
T is local temperature in Kelvin
Output: dT/dp, the rate of temperature change for pseudoadiabatic ascent
in units of K / hPa
The pseudoadiabat describes chan... | [
"Compute",
"the",
"local",
"slope",
"of",
"the",
"pseudoadiabat",
"at",
"given",
"temperature",
"and",
"pressure"
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/utils/thermo.py#L101-L124 | train |
brian-rose/climlab | climlab/dynamics/diffusion.py | _solve_implicit_banded | def _solve_implicit_banded(current, banded_matrix):
"""Uses a banded solver for matrix inversion of a tridiagonal matrix.
Converts the complete listed tridiagonal matrix *(nxn)* into a three row
matrix *(3xn)* and calls :py:func:`scipy.linalg.solve_banded()`.
:param array current: the curren... | python | def _solve_implicit_banded(current, banded_matrix):
"""Uses a banded solver for matrix inversion of a tridiagonal matrix.
Converts the complete listed tridiagonal matrix *(nxn)* into a three row
matrix *(3xn)* and calls :py:func:`scipy.linalg.solve_banded()`.
:param array current: the curren... | [
"def",
"_solve_implicit_banded",
"(",
"current",
",",
"banded_matrix",
")",
":",
"# can improve performance by storing the banded form once and not",
"# recalculating it...",
"# but whatever",
"J",
"=",
"banded_matrix",
".",
"shape",
"[",
"0",
"]",
"diag",
"=",
"np",
"... | Uses a banded solver for matrix inversion of a tridiagonal matrix.
Converts the complete listed tridiagonal matrix *(nxn)* into a three row
matrix *(3xn)* and calls :py:func:`scipy.linalg.solve_banded()`.
:param array current: the current state of the variable for which
... | [
"Uses",
"a",
"banded",
"solver",
"for",
"matrix",
"inversion",
"of",
"a",
"tridiagonal",
"matrix",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/dynamics/diffusion.py#L360-L381 | train |
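The `_solve_implicit_banded` record above repacks a full J×J tridiagonal matrix into the 3×J banded layout that `scipy.linalg.solve_banded` expects with `(l, u) = (1, 1)`. A stdlib sketch of that repacking, using nested lists in place of arrays (no scipy needed to show the layout):

```python
def to_banded(A):
    """Pack a full tridiagonal matrix A (list of rows) into the 3 x n
    layout used by scipy.linalg.solve_banded with (l, u) = (1, 1):
    row 0 holds the superdiagonal (left-padded with 0),
    row 1 the main diagonal,
    row 2 the subdiagonal (right-padded with 0)."""
    n = len(A)
    upper = [0.0] + [A[i][i + 1] for i in range(n - 1)]
    main = [A[i][i] for i in range(n)]
    lower = [A[i + 1][i] for i in range(n - 1)] + [0.0]
    return [upper, main, lower]

A = [[2.0, -1.0, 0.0],
     [-1.0, 2.0, -1.0],
     [0.0, -1.0, 2.0]]
banded = to_banded(A)
```

As the record's comment notes, performance could be improved by storing the banded form once instead of recomputing it every timestep.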
brian-rose/climlab | climlab/dynamics/diffusion.py | _guess_diffusion_axis | def _guess_diffusion_axis(process_or_domain):
"""Scans given process, domain or dictionary of domains for a diffusion axis
and returns appropriate name.
In case only one axis with length > 1 in the process or set of domains
exists, the name of that axis is returned. Otherwise an error is raised.
:... | python | def _guess_diffusion_axis(process_or_domain):
"""Scans given process, domain or dictionary of domains for a diffusion axis
and returns appropriate name.
In case only one axis with length > 1 in the process or set of domains
exists, the name of that axis is returned. Otherwise an error is raised.
:... | [
"def",
"_guess_diffusion_axis",
"(",
"process_or_domain",
")",
":",
"axes",
"=",
"get_axes",
"(",
"process_or_domain",
")",
"diff_ax",
"=",
"{",
"}",
"for",
"axname",
",",
"ax",
"in",
"axes",
".",
"items",
"(",
")",
":",
"if",
"ax",
".",
"num_points",
">... | Scans given process, domain or dictionary of domains for a diffusion axis
and returns appropriate name.
In case only one axis with length > 1 in the process or set of domains
exists, the name of that axis is returned. Otherwise an error is raised.
:param process_or_domain: input from where diffusion... | [
"Scans",
"given",
"process",
"domain",
"or",
"dictionary",
"of",
"domains",
"for",
"a",
"diffusion",
"axis",
"and",
"returns",
"appropriate",
"name",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/dynamics/diffusion.py#L384-L408 | train |
brian-rose/climlab | climlab/dynamics/diffusion.py | Diffusion._implicit_solver | def _implicit_solver(self):
"""Invertes and solves the matrix problem for diffusion matrix
and temperature T.
The method is called by the
:func:`~climlab.process.implicit.ImplicitProcess._compute()` function
of the :class:`~climlab.process.implicit.ImplicitProcess` class and
... | python | def _implicit_solver(self):
"""Invertes and solves the matrix problem for diffusion matrix
and temperature T.
The method is called by the
:func:`~climlab.process.implicit.ImplicitProcess._compute()` function
of the :class:`~climlab.process.implicit.ImplicitProcess` class and
... | [
"def",
"_implicit_solver",
"(",
"self",
")",
":",
"#if self.update_diffusivity:",
"# Time-stepping the diffusion is just inverting this matrix problem:",
"newstate",
"=",
"{",
"}",
"for",
"varname",
",",
"value",
"in",
"self",
".",
"state",
".",
"items",
"(",
")",
":"... | Invertes and solves the matrix problem for diffusion matrix
and temperature T.
The method is called by the
:func:`~climlab.process.implicit.ImplicitProcess._compute()` function
of the :class:`~climlab.process.implicit.ImplicitProcess` class and
solves the matrix problem
... | [
"Invertes",
"and",
"solves",
"the",
"matrix",
"problem",
"for",
"diffusion",
"matrix",
"and",
"temperature",
"T",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/dynamics/diffusion.py#L143-L192 | train |
brian-rose/climlab | climlab/surface/albedo.py | P2Albedo._compute_fixed | def _compute_fixed(self):
'''Recompute any fixed quantities after a change in parameters'''
try:
lon, lat = np.meshgrid(self.lon, self.lat)
except:
lat = self.lat
phi = np.deg2rad(lat)
try:
albedo = self.a0 + self.a2 * P2(np.sin(phi))
e... | python | def _compute_fixed(self):
'''Recompute any fixed quantities after a change in parameters'''
try:
lon, lat = np.meshgrid(self.lon, self.lat)
except:
lat = self.lat
phi = np.deg2rad(lat)
try:
albedo = self.a0 + self.a2 * P2(np.sin(phi))
e... | [
"def",
"_compute_fixed",
"(",
"self",
")",
":",
"try",
":",
"lon",
",",
"lat",
"=",
"np",
".",
"meshgrid",
"(",
"self",
".",
"lon",
",",
"self",
".",
"lat",
")",
"except",
":",
"lat",
"=",
"self",
".",
"lat",
"phi",
"=",
"np",
".",
"deg2rad",
"... | Recompute any fixed quantities after a change in parameters | [
"Recompute",
"any",
"fixed",
"quantities",
"after",
"a",
"change",
"in",
"parameters"
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/surface/albedo.py#L179-L194 | train |
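The `P2Albedo._compute_fixed` row above shows the key formula, `albedo = a0 + a2 * P2(sin(phi))`, where P2 is the second Legendre polynomial. A minimal standalone sketch follows; the default coefficient values are illustrative assumptions, not taken from the row.

```python
import numpy as np

def P2(x):
    # Second Legendre polynomial: P2(x) = (3x^2 - 1) / 2
    return 0.5 * (3 * x**2 - 1)

def p2_albedo(lat_deg, a0=0.33, a2=0.25):
    """Smooth latitude-dependent albedo a0 + a2 * P2(sin(lat)),
    a classic energy-balance-model parameterization: low albedo in
    the tropics, high albedo toward the poles."""
    phi = np.deg2rad(np.asarray(lat_deg, dtype=float))
    return a0 + a2 * P2(np.sin(phi))

alb = p2_albedo(np.array([-90.0, 0.0, 90.0]))
```

At the equator sin(phi) = 0 so P2 = -0.5, giving the minimum albedo a0 - a2/2; at the poles sin(phi) = ±1 so P2 = 1, giving the maximum a0 + a2.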
brian-rose/climlab | climlab/surface/albedo.py | Iceline.find_icelines | def find_icelines(self):
"""Finds iceline according to the surface temperature.
This method is called by the private function
:func:`~climlab.surface.albedo.Iceline._compute`
and updates following attributes according to the freezing temperature
``self.param['Tf']`` and the surf... | python | def find_icelines(self):
"""Finds iceline according to the surface temperature.
This method is called by the private function
:func:`~climlab.surface.albedo.Iceline._compute`
and updates following attributes according to the freezing temperature
``self.param['Tf']`` and the surf... | [
"def",
"find_icelines",
"(",
"self",
")",
":",
"Tf",
"=",
"self",
".",
"param",
"[",
"'Tf'",
"]",
"Ts",
"=",
"self",
".",
"state",
"[",
"'Ts'",
"]",
"lat_bounds",
"=",
"self",
".",
"domains",
"[",
"'Ts'",
"]",
".",
"axes",
"[",
"'lat'",
"]",
".",... | Finds iceline according to the surface temperature.
This method is called by the private function
:func:`~climlab.surface.albedo.Iceline._compute`
and updates following attributes according to the freezing temperature
``self.param['Tf']`` and the surface temperature ``self.param['Ts']``... | [
"Finds",
"iceline",
"according",
"to",
"the",
"surface",
"temperature",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/surface/albedo.py#L236-L291 | train |
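The `Iceline.find_icelines` row above is truncated, so here is a rough sketch of the idea it describes: mark grid cells as ice-covered where surface temperature falls below a freezing threshold, then locate the ice-edge latitudes. The threshold `Tf=-10.0` and the midpoint-between-cells edge convention are assumptions for illustration, not climlab's exact logic.

```python
import numpy as np

def find_iceline(lat, Ts, Tf=-10.0):
    """Return an ice mask (True where Ts < Tf) and the ice-edge
    latitudes, taken as midpoints between neighbouring cells where
    the mask flips."""
    lat = np.asarray(lat, dtype=float)
    Ts = np.asarray(Ts, dtype=float)
    ice = Ts < Tf
    # Indices i where cell i and cell i+1 disagree about ice cover
    flips = np.nonzero(ice[:-1] != ice[1:])[0]
    edges = 0.5 * (lat[flips] + lat[flips + 1])
    return ice, edges

lat = np.array([-85.0, -55.0, -25.0, 25.0, 55.0, 85.0])
Ts = np.array([-20.0, -5.0, 10.0, 12.0, -4.0, -22.0])
ice, edges = find_iceline(lat, Ts)
```

For this profile only the polar cells are below Tf, so the mask flips once in each hemisphere and two ice edges are reported.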
brian-rose/climlab | climlab/surface/albedo.py | StepFunctionAlbedo._get_current_albedo | def _get_current_albedo(self):
'''Simple step-function albedo based on ice line at temperature Tf.'''
ice = self.subprocess['iceline'].ice
# noice = self.subprocess['iceline'].diagnostics['noice']
cold_albedo = self.subprocess['cold_albedo'].albedo
warm_albedo = self.subprocess['... | python | def _get_current_albedo(self):
'''Simple step-function albedo based on ice line at temperature Tf.'''
ice = self.subprocess['iceline'].ice
# noice = self.subprocess['iceline'].diagnostics['noice']
cold_albedo = self.subprocess['cold_albedo'].albedo
warm_albedo = self.subprocess['... | [
"def",
"_get_current_albedo",
"(",
"self",
")",
":",
"ice",
"=",
"self",
".",
"subprocess",
"[",
"'iceline'",
"]",
".",
"ice",
"# noice = self.subprocess['iceline'].diagnostics['noice']",
"cold_albedo",
"=",
"self",
".",
"subprocess",
"[",
"'cold_albedo'",
"]",
".",... | Simple step-function albedo based on ice line at temperature Tf. | [
"Simple",
"step",
"-",
"function",
"albedo",
"based",
"on",
"ice",
"line",
"at",
"temperature",
"Tf",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/surface/albedo.py#L374-L381 | train |
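The `_get_current_albedo` row above combines a cold (icy) and a warm (ice-free) albedo using the iceline diagnostic. A self-contained sketch of that step-function idea follows; the numeric defaults are illustrative assumptions.

```python
import numpy as np

def step_function_albedo(Ts, Tf=-10.0, albedo_warm=0.3, albedo_cold=0.62):
    """Step-function albedo: the cold (ice) value wherever the surface
    temperature is below the freezing threshold Tf, the warm value
    elsewhere."""
    Ts = np.asarray(Ts, dtype=float)
    return np.where(Ts < Tf, albedo_cold, albedo_warm)

alb = step_function_albedo(np.array([-20.0, 0.0, 15.0]))
```

This is the albedo feedback in its simplest form: cooling a cell past Tf raises its albedo in a single discontinuous jump.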
brian-rose/climlab | climlab/process/process.py | process_like | def process_like(proc):
"""Make an exact clone of a process, including state and all subprocesses.
The creation date is updated.
:param proc: process
:type proc: :class:`~climlab.process.process.Process`
:return: new process identical to the given process
:rtype: :class:`... | python | def process_like(proc):
"""Make an exact clone of a process, including state and all subprocesses.
The creation date is updated.
:param proc: process
:type proc: :class:`~climlab.process.process.Process`
:return: new process identical to the given process
:rtype: :class:`... | [
"def",
"process_like",
"(",
"proc",
")",
":",
"newproc",
"=",
"copy",
".",
"deepcopy",
"(",
"proc",
")",
"newproc",
".",
"creation_date",
"=",
"time",
".",
"strftime",
"(",
"\"%a, %d %b %Y %H:%M:%S %z\"",
",",
"time",
".",
"localtime",
"(",
")",
")",
"retu... | Make an exact clone of a process, including state and all subprocesses.
The creation date is updated.
:param proc: process
:type proc: :class:`~climlab.process.process.Process`
:return: new process identical to the given process
:rtype: :class:`~climlab.process.process.Proces... | [
"Make",
"an",
"exact",
"clone",
"of",
"a",
"process",
"including",
"state",
"and",
"all",
"subprocesses",
"."
] | eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6 | https://github.com/brian-rose/climlab/blob/eae188a2ae9308229b8cbb8fe0b65f51b50ee1e6/climlab/process/process.py#L783-L817 | train |
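The `process_like` row above is one of the few whose code is fully visible: a `copy.deepcopy` of the process followed by a refreshed creation date. Here is a runnable re-creation using a minimal stand-in class (the `Process` class below is a hypothetical simplification, not climlab's):

```python
import copy
import time

class Process:
    """Minimal stand-in for a climlab-style process object."""
    def __init__(self, state):
        self.state = state
        self.creation_date = time.strftime("%a, %d %b %Y %H:%M:%S %z",
                                           time.localtime())

def process_like(proc):
    """Exact clone of a process (state and everything else reachable
    via deepcopy), with a freshly stamped creation date."""
    newproc = copy.deepcopy(proc)
    newproc.creation_date = time.strftime("%a, %d %b %Y %H:%M:%S %z",
                                          time.localtime())
    return newproc

p = Process({'Ts': [10.0, 20.0]})
q = process_like(p)
q.state['Ts'][0] = -5.0  # mutating the clone must not touch the original
```

The deep copy is what makes the clone safe to integrate independently: shared mutable state between original and clone would let one model run corrupt the other.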