BaseOntology

Here we provide the base data models that can be used to define new ontologies. The base data models are classes that inherit from pydantic.BaseModel and define their fields as annotated attributes.
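
For orientation, the following is a minimal sketch of how these base models fit together. All names here (OntoLab, HasName, HasPart, Device, Lab) are illustrative, not part of the library; the declaration style follows the signatures documented on the rest of this page:

from twa.data_model.base_ontology import (
    BaseOntology, BaseClass, ObjectProperty, DatatypeProperty
)

# an ontology is declared by subclassing BaseOntology
class OntoLab(BaseOntology):
    base_url = 'https://www.theworldavatar.com/kg/'
    namespace = 'ontolab'
    owl_versionInfo = '0.0.1'
    rdfs_comment = 'A toy ontology for a laboratory'

# properties are created from the base property classes and bound to the ontology
HasName = DatatypeProperty.create_from_base('HasName', OntoLab, 0, 1)
HasPart = ObjectProperty.create_from_base('HasPart', OntoLab)

# classes subclass BaseClass and declare their properties as annotated fields
class Device(BaseClass):
    rdfs_isDefinedBy = OntoLab
    hasName: HasName[str]

class Lab(BaseClass):
    rdfs_isDefinedBy = OntoLab
    hasPart: HasPart[Device]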

T module-attribute

T = TypeVar('T')

A type variable to represent any type in Python. It is used as a placeholder for any concept in the ontologies.

KnowledgeGraph

Bases: BaseModel

This class is used to represent a knowledge graph that consists of Pydantic objects held in Python memory.

Attributes:
    ontology_lookup (Dict[str, BaseOntology]): A class variable to store the lookup dictionary of ontologies
    class_lookup (Dict[str, BaseClass]): A class variable to store the lookup dictionary of classes
    property_lookup (Dict[str, BaseProperty]): A class variable to store the lookup dictionary of properties

graph classmethod

graph() -> Graph

This method is used to retrieve the knowledge graph in Python memory.

Returns:
    Graph: The rdflib.Graph object of the knowledge graph

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def graph(cls) -> Graph:
    """
    This method is used to retrieve the knowledge graph in Python memory.

    Returns:
        Graph: The rdflib.Graph object of the knowledge graph
    """
    g = Graph()
    for iri, o in cls.construct_object_lookup().items():
        g += o.graph()
    return g
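
A short usage sketch, assuming some objects have already been instantiated in Python memory:

g = KnowledgeGraph.graph()
print(g.serialize(format='ttl'))  # all triples currently held in Python memory, as Turtle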

all_triples_of_nodes classmethod

all_triples_of_nodes(iris: Union[str, list]) -> Graph

This method is used to retrieve all (incoming and outgoing) triples of the given nodes in the knowledge graph.

Parameters:
    iris (str or list): The IRI(s) of the nodes to be retrieved. Required.

Returns:
    Graph: The rdflib.Graph object of the triples of the given nodes

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def all_triples_of_nodes(cls, iris: Union[str, list]) -> Graph:
    """
    This method is used to retrieve all (in-coming and out-going) triples of the given nodes in the knowledge graph.

    Args:
        iris (str or list): The IRI of the nodes to be retrieved

    Returns:
        Graph: The rdflib.Graph object of the triples of the given nodes
    """
    # ensure iris is a list
    if isinstance(iris, str):
        iris = [iris]

    # convert strings to URIRef if necessary
    iris = [URIRef(iri) if isinstance(iri, str) else iri for iri in iris]

    source_g = cls.graph()
    result_g = Graph()

    # add triples to result_graph
    for iri in iris:
        for triple in source_g.triples((iri, None, None)):
            result_g.add(triple)
        for triple in source_g.triples((None, None, iri)):
            result_g.add(triple)
    return result_g
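
For example, to collect every triple touching a given node (the IRI below is a placeholder):

sub_graph = KnowledgeGraph.all_triples_of_nodes(
    'https://www.theworldavatar.com/kg/ontolab/Device_1'  # placeholder IRI
)
for s, p, o in sub_graph:
    print(s, p, o)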

construct_object_lookup classmethod

construct_object_lookup() -> Dict[str, BaseClass]

This method is used to retrieve all BaseClass (pydantic) objects created in Python memory.

Returns:
    Dict[str, BaseClass]: A dictionary of BaseClass (pydantic) objects with their IRIs as keys

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def construct_object_lookup(cls) -> Dict[str, BaseClass]:
    """
    This method is used to retrieve all BaseClass (pydantic) objects created in Python memory.

    Returns:
        A dictionary of BaseClass (pydantic) objects with their IRIs as keys
    """
    if cls.class_lookup is None:
        return {}
    return {i: o for clz in cls.class_lookup.values() if bool(clz.object_lookup) for i, o in clz.object_lookup.items()}

get_object_from_lookup classmethod

get_object_from_lookup(iri: str) -> Union[BaseClass, None]

This method is used to retrieve an object from Python memory given its IRI.

Parameters:
    iri (str): IRI of the object to be retrieved. Required.

Returns:
    Union[BaseClass, None]: The pydantic object of the given IRI if it exists, otherwise None.

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def get_object_from_lookup(cls, iri: str) -> Union[BaseClass, None]:
    """
    This method is used to retrieve an object from Python memory given its IRI.

    Args:
        iri (str): IRI of the object to be retrieved

    Returns:
        The pydantic object of the given IRI if exist, otherwise return None.
    """
    return cls.construct_object_lookup().get(iri, None)

clear_object_lookup classmethod

clear_object_lookup()

This method is used to clear the object lookup dictionary in Python memory.

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def clear_object_lookup(cls):
    """ This method is used to clear the object lookup dictionary in Python memory. """
    for clz in cls.class_lookup.values():
        clz.clear_object_lookup()

BaseOntology

Bases: BaseModel

This class is used to represent an ontology, which consists of a collection of BaseClass classes and ObjectProperty/DatatypeProperty properties.

Attributes:
    base_url (str): The base URL used to construct the namespace IRI; the default value is 'https://www.theworldavatar.com/kg/'
    namespace (str): The namespace of the ontology, e.g. 'ontolab'
    namespace_iri (str): The namespace IRI of the ontology, e.g. 'https://www.theworldavatar.com/kg/ontolab'
    class_lookup (Dict[str, BaseClass]): A dictionary of BaseClass classes with their rdf:type as keys
    object_property_lookup (Dict[str, ObjectProperty]): A dictionary of ObjectProperty classes with their predicate IRI as keys
    data_property_lookup (Dict[str, DatatypeProperty]): A dictionary of DatatypeProperty classes with their predicate IRI as keys
    rdfs_comment (Set[str]): The comment(s) of the ontology
    owl_versionInfo (str): The version of the ontology

is_dev_mode classmethod

is_dev_mode()

This method returns whether the KnowledgeGraph is in development mode.

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def is_dev_mode(cls):
    """This method returns whether the KnowledgeGraph is in development mode."""
    return cls._dev_mode

set_dev_mode classmethod

set_dev_mode()

This method sets the KnowledgeGraph to development mode, where duplicate class or property registration is allowed and the existing registrations are overwritten.

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def set_dev_mode(cls):
    """This method sets the KnowledgeGraph to development mode, where duplicate class or property registration will be allowed that the existing ones will be overwritten."""
    cls._dev_mode = True

set_prod_mode classmethod

set_prod_mode()

This method sets the KnowledgeGraph to production mode, where duplicate class or property registration will raise an error.

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def set_prod_mode(cls):
    """This method sets the KnowledgeGraph to production mode, where duplicate class or property registration will raise an error."""
    cls._dev_mode = False

export_to_graph classmethod

export_to_graph(g: Graph = None) -> Graph

This method is used to export the ontology to a rdflib.Graph object. It operates at the TBox level, i.e. it only exports the classes and properties of the ontology.

Parameters:
    g (Graph): The rdflib.Graph object to which the ontology will be exported. Defaults to None (a new graph is created).
Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def export_to_graph(cls, g: Graph = None) -> Graph:
    """
    This method is used to export the ontology to a rdflib.Graph object.
    It operates at the TBox level, i.e. it only exports the classes and properties of the ontology.

    Args:
        g (Graph): The rdflib.Graph object to which the ontology will be exported
    """
    if g is None:
        g = Graph()
    # metadata
    g.add((URIRef(cls.namespace_iri), RDF.type, OWL.Ontology))
    g.add((URIRef(cls.namespace_iri), DC.date, Literal(datetime.now().isoformat())))
    if bool(cls.rdfs_comment):
        if isinstance(cls.rdfs_comment, str):
            g.add((URIRef(cls.namespace_iri), RDFS.comment, Literal(cls.rdfs_comment)))
        elif isinstance(cls.rdfs_comment, set):
            for comment in cls.rdfs_comment:
                g.add((URIRef(cls.namespace_iri), RDFS.comment, Literal(comment)))
    if bool(cls.owl_versionInfo):
        g.add((URIRef(cls.namespace_iri), OWL.versionInfo, Literal(cls.owl_versionInfo)))
    # handle all classes
    if bool(cls.class_lookup):
        for clz in cls.class_lookup.values():
            g = clz._export_to_owl(g)
    # handle all object and data properties
    property_domain_range_lookup = KnowledgeGraph._construct_property_domain_range_lookup()
    if bool(cls.object_property_lookup):
        for prop in cls.object_property_lookup.values():
            g = prop._export_to_owl(
                g,
                property_domain_range_lookup.get(prop.predicate_iri, {'rdfs_domain': set()})['rdfs_domain'],
                property_domain_range_lookup.get(prop.predicate_iri, {'rdfs_range': set()})['rdfs_range'],
            )
    # handle all data properties
    if bool(cls.data_property_lookup):
        for prop in cls.data_property_lookup.values():
            g = prop._export_to_owl(
                g,
                property_domain_range_lookup.get(prop.predicate_iri, {'rdfs_domain': set()})['rdfs_domain'],
                property_domain_range_lookup.get(prop.predicate_iri, {'rdfs_range': set()})['rdfs_range'],
            )

    return g

export_to_triple_store classmethod

export_to_triple_store(sparql_client: PySparqlClient)

This method is used to export the ontology to a triplestore. It operates at the TBox level, i.e. it only exports the classes and properties of the ontology.

Parameters:
    sparql_client (PySparqlClient): The PySparqlClient object that connects to the triplestore. Required.
Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def export_to_triple_store(cls, sparql_client: PySparqlClient):
    """
    This method is used to export the ontology to a triplestore.
    It operates at the TBox level, i.e. it only exports the classes and properties of the ontology.

    Args:
        sparql_client (PySparqlClient): The PySparqlClient object that connects to the triplestore
    """
    g = cls.export_to_graph()

    # upload to triplestore
    sparql_client.upload_graph(g)

export_to_owl classmethod

export_to_owl(file_path: str, format: str = 'ttl')

This method is used to export the ontology to an ontology file. It operates at the TBox level, i.e. it only exports the classes and properties of the ontology.

Parameters:
    file_path (str): The path of the ontology file to be exported to. Required.
    format (str): The format of the ontology file. Defaults to 'ttl'.
Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def export_to_owl(cls, file_path: str, format: str = 'ttl'):
    """
    This method is used to export the ontology to an ontology file.
    It operates at the TBox level, i.e. it only exports the classes and properties of the ontology.

    Args:
        file_path (str): The path of the ontology file to be exported to
        format (str): The format of the ontology file, the default value is 'ttl'
    """
    g = cls.export_to_graph()

    # serialize
    g.serialize(destination=file_path, format=format)
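
A usage sketch, continuing the illustrative OntoLab ontology from the top of this page:

# write the TBox (classes and properties) of the ontology to a Turtle file
OntoLab.export_to_owl('ontolab.ttl', format='ttl')

# alternatively, push the same TBox triples to a triplestore via an existing PySparqlClient
# OntoLab.export_to_triple_store(sparql_client)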

BaseProperty

BaseProperty(*args, **kwargs)

Bases: set, Generic[T]

Base class that is inherited by ObjectProperty and DatatypeProperty.

Attributes:
    rdfs_isDefinedBy (Type[BaseOntology]): The ontology that defines the property
    predicate_iri (str): The predicate IRI of the property
    owl_minQualifiedCardinality (int): The minimum qualified cardinality of the property (default is 0)
    owl_maxQualifiedCardinality (int): The maximum qualified cardinality of the property (default is None, meaning infinite)

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def __init__(self, *args, **kwargs):
    super().__init__(*args, **kwargs)

retrieve_cardinality classmethod

retrieve_cardinality() -> Tuple[int, int]

This method is used to retrieve the cardinality of the property.

Returns:
    Tuple[int, int]: The minimum and maximum cardinality of the property

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def retrieve_cardinality(cls) -> Tuple[int, int]:
    """
    This method is used to retrieve the cardinality of the property.

    Returns:
        Tuple[int, int]: The minimum and maximum cardinality of the property
    """
    return cls.owl_minQualifiedCardinality, cls.owl_maxQualifiedCardinality

create_from_base classmethod

create_from_base(class_name: str, ontology: Type[BaseOntology], min_cardinality: Optional[int] = 0, max_cardinality: Optional[int] = None) -> Type[BaseProperty]

This method is used to create a new property class from the calling class. The new property class will inherit the min and max cardinality from the calling class if not specified.

Parameters:
    class_name (str): The name of the new property class. Required.
    ontology (Type[BaseOntology]): The ontology that defines the property. Required.
    min_cardinality (Optional[int]): The minimum qualified cardinality of the property. Defaults to 0.
    max_cardinality (Optional[int]): The maximum qualified cardinality of the property. Defaults to None, meaning infinite.

Returns:
    Type[BaseProperty]: The new property class

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def create_from_base(
    cls,
    class_name: str,
    ontology: Type[BaseOntology],
    min_cardinality: Optional[int] = 0,
    max_cardinality: Optional[int] = None,
) -> Type[BaseProperty]:
    """
    This method is used to create a new property class from the calling class.
    The new property class will inherit the min and max cardinality from the calling class if not specified.

    Args:
        class_name (str): The name of the new property class
        ontology (Type[BaseOntology]): The ontology that defines the property
        min_cardinality (Optional[int], optional): The minimum qualified cardinality of the property (defaults to 0)
        max_cardinality (Optional[int], optional): The maximum qualified cardinality of the property (defaults to None meaning infinite)

    Returns:
        Type[BaseProperty]: The new property class
    """
    # NOTE we inherit cardinality from the calling cls if not specified
    return type(class_name, (cls,), {
        'rdfs_isDefinedBy': ontology,
        'owl_minQualifiedCardinality': min_cardinality if bool(min_cardinality) else cls.owl_minQualifiedCardinality,
        'owl_maxQualifiedCardinality': max_cardinality if bool(max_cardinality) else cls.owl_maxQualifiedCardinality,
    })
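
For example, a more constrained property can be derived from an existing one (names are illustrative, continuing the HasName/OntoLab sketch from the top of this page; unspecified cardinalities are inherited from the calling class):

# exactly one name: minimum and maximum qualified cardinality both set to 1
HasExactlyOneName = HasName.create_from_base('HasExactlyOneName', OntoLab, 1, 1)
print(HasExactlyOneName.retrieve_cardinality())  # (1, 1)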

BaseClass

BaseClass(**data)

Bases: BaseModel

Base class for all the Python classes that are used to define the classes in an ontology.

Attributes:
    rdfs_isDefinedBy (BaseOntology): The ontology that defines the class
    rdf_type (str): The rdf:type of the class
    object_lookup (Dict[str, BaseClass]): A dictionary that maps the IRI of an object to the object itself
    rdfs_comment (str): The comment of the instance
    rdfs_label (str): The label of the instance
    instance_iri (str): The IRI of the instance

Example:
    class MyClass(BaseClass):
        myObjectProperty: MyObjectProperty[MyOtherClass]
        myDatatypeProperty: MyDatatypeProperty[str]

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def __init__(self, **data):
    # handle the case when rdfs_comment and rdfs_label are provided as a non-set value
    if 'rdfs_comment' in data and not isinstance(data['rdfs_comment'], set):
        if isinstance(data['rdfs_comment'], list):
            data['rdfs_comment'] = set(data['rdfs_comment'])
        else:
            data['rdfs_comment'] = {data['rdfs_comment']}
    if 'rdfs_label' in data and not isinstance(data['rdfs_label'], set):
        if isinstance(data['rdfs_label'], list):
            data['rdfs_label'] = set(data['rdfs_label'])
        else:
            data['rdfs_label'] = {data['rdfs_label']}
    super().__init__(**data)

rdfs_isDefinedBy class-attribute

rdfs_isDefinedBy: BaseOntology = None

NOTE for all subclasses, one can simply set rdfs_isDefinedBy = MyOntology; see the related discussion in the Pydantic documentation.

rdf_type class-attribute

rdf_type: str = OWL_BASE_URL + 'Class'

NOTE rdf_type is the automatically generated IRI of the class which can also be accessed at the instance level.

model_post_init

model_post_init(__context: Any) -> None

The post init process of the BaseClass. It sets the instance_iri if it is not set. It also registers the object to the lookup dictionary of the class.

Parameters:
    __context (Any): Any other context that is needed for the post init process. Required.

Returns:
    None: It calls super().model_post_init(__context) to finish the post init process

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def model_post_init(self, __context: Any) -> None:
    """
    The post init process of the BaseClass.
    It sets the instance_iri if it is not set.
    It also registers the object to the lookup dictionary of the class.

    Args:
        __context (Any): Any other context that is needed for the post init process

    Returns:
        None: It calls the super().model_post_init(__context) to finish the post init process
    """
    if not bool(self.instance_iri):
        self.instance_iri = self.__class__.init_instance_iri()
    # set new instance to the global look up table, so that we can avoid creating the same instance multiple times
    self._register_object()
    return super().model_post_init(__context)

retrieve_subclass classmethod

retrieve_subclass(iri: str) -> Type[BaseClass]

This function retrieves the subclass of the current class based on the IRI. If the IRI is the same as the rdf:type of the current class, it will return the current class itself.

Parameters:
    iri (str): The IRI of the subclass. Required.

Returns:
    Type[BaseClass]: The subclass of the BaseClass

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def retrieve_subclass(cls, iri: str) -> Type[BaseClass]:
    """
    This function retrieves the subclass of the current class based on the IRI.
    If the IRI is the same as the rdf:type of the current class, it will return the current class itself.

    Args:
        iri (str): The IRI of the subclass

    Returns:
        Type[BaseClass]: The subclass of the BaseClass
    """
    if iri == cls.rdf_type:
        return cls
    return cls.construct_subclass_dictionary()[iri]

construct_subclass_dictionary classmethod

construct_subclass_dictionary() -> Dict[str, Type[BaseClass]]

This function constructs a dictionary that maps the rdf:type to the subclass of the BaseClass.

Returns:
    Dict[str, Type[BaseClass]]: The dictionary that maps the rdf:type to the subclass of the BaseClass

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def construct_subclass_dictionary(cls) -> Dict[str, Type[BaseClass]]:
    """
    This function constructs a dictionary that maps the rdf:type to the subclass of the BaseClass.

    Returns:
        Dict[str, Type[BaseClass]]: The dictionary that maps the rdf:type to the subclass of the BaseClass
    """
    subclass_dict = {}
    for clz in cls.__subclasses__():
        subclass_dict[clz.rdf_type] = clz
        # recursively add the subclass of the subclass
        subclass_dict.update(clz.construct_subclass_dictionary())
    return subclass_dict
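
Illustrative usage, assuming a hypothetical Sensor subclass of the Device class sketched at the top of this page:

class Sensor(Device):
    rdfs_isDefinedBy = OntoLab

print(Device.construct_subclass_dictionary())               # maps Sensor's rdf:type IRI to the Sensor class
print(Device.retrieve_subclass(Device.rdf_type) is Device)  # True: the class's own rdf:type returns itself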

push_all_instances_to_kg classmethod

push_all_instances_to_kg(sparql_client: PySparqlClient, recursive_depth: int = 0, force_overwrite_local: bool = False)

This function pushes all the instances of the class to the knowledge graph.

Parameters:
    sparql_client (PySparqlClient): The SPARQL client that is used to push the data to the KG. Required.
    recursive_depth (int): The depth of the recursion; 0 means no recursion, -1 means infinite recursion, n means n-level recursion. Defaults to 0.
    force_overwrite_local (bool): Whether to force overwrite the local values with the remote values. Defaults to False.
Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def push_all_instances_to_kg(
    cls,
    sparql_client: PySparqlClient,
    recursive_depth: int = 0,
    force_overwrite_local: bool = False,
):
    """
    This function pushes all the instances of the class to the knowledge graph.

    Args:
        sparql_client (PySparqlClient): The SPARQL client that is used to push the data to the KG
        recursive_depth (int): The depth of the recursion, 0 means no recursion, -1 means infinite recursion, n means n-level recursion
        force_overwrite_local (bool): Whether to force overwrite the local values with the remote values
    """
    g_to_remove = Graph()
    g_to_add = Graph()
    cls.pull_from_kg(cls.object_lookup.keys(), sparql_client, recursive_depth, force_overwrite_local)
    for obj in cls.object_lookup.values():
        g_to_remove, g_to_add = obj._collect_diff_to_graph(g_to_remove, g_to_add, recursive_depth)
    sparql_client.delete_and_insert_graphs(g_to_remove, g_to_add)
    return g_to_remove, g_to_add

clear_object_lookup classmethod

clear_object_lookup()

This function clears the lookup dictionary of the class.

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def clear_object_lookup(cls):
    """
    This function clears the lookup dictionary of the class.
    """
    if cls.object_lookup is not None:
        iris = list(cls.object_lookup.keys())
        for i in iris:
            del cls.object_lookup[i]

pull_from_kg classmethod

pull_from_kg(iris: List[str], sparql_client: PySparqlClient, recursive_depth: int = 0, force_overwrite_local: bool = False) -> List[BaseClass]

This function pulls the objects from the KG based on the given IRIs.

Parameters:
    iris (List[str]): The list of IRIs of the objects to pull from the KG. Required.
    sparql_client (PySparqlClient): The SPARQL client that is used to pull the data from the KG. Required.
    recursive_depth (int): The depth of the recursion; 0 means no recursion, -1 means infinite recursion, n means n-level recursion. Defaults to 0.
    force_overwrite_local (bool): Whether to force overwrite the local values with the remote values. Defaults to False.

Raises:
    ValueError: The rdf:type of the IRI provided does not match the calling class

Returns:
    List[BaseClass]: A list of objects that are pulled from the KG

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def pull_from_kg(
    cls,
    iris: List[str],
    sparql_client: PySparqlClient,
    recursive_depth: int = 0,
    force_overwrite_local: bool = False,
) -> List[BaseClass]:
    """
    This function pulls the objects from the KG based on the given IRIs.

    Args:
        iris (List[str]): The list of IRIs of the objects that one wants to pull from the KG
        sparql_client (PySparqlClient): The SPARQL client that is used to pull the data from the KG
        recursive_depth (int): The depth of the recursion, 0 means no recursion, -1 means infinite recursion, n means n-level recursion
        force_overwrite_local (bool): Whether to force overwrite the local values with the remote values

    Raises:
        ValueError: The rdf:type of the IRI provided does not match the calling class

    Returns:
        List[BaseClass]: A list of objects that are pulled from the KG
    """
    if isinstance(iris, str):
        iris = [iris]
    iris = set(iris)
    # if the iris are not provided, then just return empty list
    if not bool(iris):
        return []
    # prepare the list to be returned
    instance_lst = []

    # check if any of the iris are loading
    i_loading = set()
    for i in iris:
        if KnowledgeGraph._is_iri_been_loading(i):
            # for those that are loading, use string here and remove it from query
            instance_lst.append(i)
            i_loading.add(i)
        else:
            # for those that are not loading, indicate they are to be loaded now
            KnowledgeGraph._add_iri_to_loading(i)
    iris = iris - i_loading

    # behaviour of recursive_depth: 0 means no recursion, -1 means infinite recursion, n means n-level recursion
    flag_pull = abs(recursive_depth) > 0
    recursive_depth = max(recursive_depth - 1, 0) if recursive_depth > -1 else max(recursive_depth - 1, -1)
    # TODO what do we do with undefined properties in python class? - write a warning message or we can add them to extra_fields https://docs.pydantic.dev/latest/concepts/models/#extra-fields
    # return format: {iri: {predicate: {object}}}
    node_dct = sparql_client.get_outgoing_and_attributes(iris)
    for iri, props in node_dct.items():
        # TODO optimise the time complexity of the following code when the number of instances is large
        # check if the rdf:type of the instance matches the calling class or any of its subclasses
        target_clz_rdf_types = set(props.get(RDF.type.toPython(), [])) # NOTE this supports instance instantiated with multiple rdf:type
        if not target_clz_rdf_types:
            raise ValueError(f"The instance {iri} has no rdf:type, retrieved outgoing links and attributes: {props}.")
        cls_subclasses = set(cls.construct_subclass_dictionary().keys())
        cls_subclasses.add(cls.rdf_type)
        intersection = target_clz_rdf_types & cls_subclasses
        if intersection:
            if len(intersection) == 1:
                target_clz_rdf_type = next(iter(intersection))
            else:
                # NOTE instead of using the first element of the intersection
                # we find the deepest subclass as target_clz_rdf_type
                # so that the created object could inherit all the properties of its parent classes
                # which prevents the loss of information
                parent_classes = set()
                for c in intersection:
                    if c in parent_classes:
                        # skip if it's already a parent class
                        continue
                    for other in intersection:
                        if other != c and issubclass(cls.retrieve_subclass(c), cls.retrieve_subclass(other)):
                            parent_classes.add(other)
                deepest_subclasses = intersection - parent_classes
                if len(deepest_subclasses) > 1:
                    # TODO [future] add support for allowing users to specify the target class
                    KnowledgeGraph._remove_iri_from_loading(iri)
                    raise ValueError(
                        f"""The instance {iri} is of type {target_clz_rdf_types}.
                        Amongst the pulling class {cls.__name__} ({cls.rdf_type})
                        and its subclasses ({cls.construct_subclass_dictionary()}),
                        there exist classes that are not in the same branch of the inheritance tree,
                        including {deepest_subclasses},
                        therefore it cannot be instantiated by pulling with class {cls.__name__}.
                        Please consider pulling the instance directly with one of the class in {deepest_subclasses}
                        Alternatively, please check the inheritance tree is correctly defined in Python.""")
                else:
                    target_clz_rdf_type = next(iter(deepest_subclasses))
        else:
            # if there's any error, remove the iri from the loading status
            # otherwise it will block any further pulling of the same object
            KnowledgeGraph._remove_iri_from_loading(iri)
            raise ValueError(
                f"""The instance {iri} is of type {target_clz_rdf_types},
                it doesn't match the rdf:type of class {cls.__name__} ({cls.rdf_type}),
                nor any of its subclasses ({cls.construct_subclass_dictionary()}),
                therefore it cannot be instantiated.""")
        inst = KnowledgeGraph.get_object_from_lookup(iri)
        # obtain the target class in case it is a subclass
        target_clz = cls.retrieve_subclass(target_clz_rdf_type)
        # rebuild the model in case there're any ForwardRef that were not resolved previously
        target_clz.model_rebuild()

        # instead of calling cls.get_object_properties() and cls.get_data_properties()
        # calling methods of target_clz ensures that all properties are correctly inherited
        ops = target_clz.get_object_properties()
        dps = target_clz.get_data_properties()
        # handle object properties (where the recursion happens)
        # the situation where two instances pointing to each other (or if there's circular nodes)
        #   is enabled by stopping pulling at KnowledgeGraph.iri_loading_in_progress
        # here object_properties_dict is a fetch of the remote KG
        object_properties_dict = {}
        for op_iri, op_dct in ops.items():
            _set = set()
            if op_iri in props:
                if flag_pull:
                    c_tp: BaseClass = get_args(op_dct['type'])[0]
                    _set = c_tp.pull_from_kg(props[op_iri], sparql_client, recursive_depth, force_overwrite_local)
                else:
                    _set = set(props[op_iri])
            object_properties_dict[op_dct['field']] = _set
        # here we handle data properties (data_properties_dict is a fetch of the remote KG)
        data_properties_dict = {}
        for dp_iri, dp_dct in dps.items():
            if dp_iri in props:
                # here we need to convert the data property to the correct type
                _dp_tp = get_args(dp_dct['type'])[0]
                data_properties_dict[dp_dct['field']] = set([_dp_tp(_) for _ in props[dp_iri]])
            else:
                data_properties_dict[dp_dct['field']] = set()
        # handle rdfs:label and rdfs:comment (also fetch of the remote KG)
        rdfs_properties_dict = {}
        if RDFS.label.toPython() in props:
            rdfs_properties_dict['rdfs_label'] = set(list(props[RDFS.label.toPython()]))
        if RDFS.comment.toPython() in props:
            rdfs_properties_dict['rdfs_comment'] = set(list(props[RDFS.comment.toPython()]))
        # instantiate the object
        if inst is not None and type(inst) is target_clz:
            for op_iri, op_dct in ops.items():
                if flag_pull:
                    # below lines pull those object properties that are NOT connected in the remote KG,
                    # but are connected in the local python memory
                    # e.g. object `a` has a field `to_b` that points to object `b`
                    # but triple <a> <to_b> <b> does not exist in the KG
                    # this code then ensures the cache of object `b` is accurate
                    # TODO [future] below query can be combined with those connected in the KG to save amount of queries
                    c_tp: BaseClass = get_args(op_dct['type'])[0]
                    _o = getattr(inst, op_dct['field']) if getattr(inst, op_dct['field']) is not None else set()
                    c_tp.pull_from_kg(
                        set([o.instance_iri if isinstance(o, BaseClass) else o for o in _o]) - set(props.get(op_iri, [])),
                        sparql_client, recursive_depth, force_overwrite_local)
            # now collect all fetched values
            fetched = {
                k: set([o.instance_iri if isinstance(o, BaseClass) else o for o in v])
                for k, v in object_properties_dict.items()
            } # object properties
            fetched.update({k: set(copy.deepcopy(v)) for k, v in data_properties_dict.items()}) # data properties
            fetched.update(rdfs_properties_dict) # rdfs properties
            # compare it with cached values and local values for all object/data/rdfs properties
            # if the object is already in the lookup, then update the object for those fields that are not modified in the python
            try:
                inst._update_according_to_fetch(fetched, flag_pull, force_overwrite_local)
            except Exception as e:
                # if there's any error, remove the iri from the loading status
                # otherwise it will block any further pulling of the same object
                KnowledgeGraph._remove_iri_from_loading(inst.instance_iri)
                raise e
        else:
            # if the object is not in the lookup, create a new object
            inst = target_clz(
                instance_iri=iri,
                **rdfs_properties_dict,
                **object_properties_dict,
                **data_properties_dict,
            )
            inst._create_cache()

        inst._exist_in_kg = True
        # update cache here
        instance_lst.append(inst)
        # remove inst from the loading status
        KnowledgeGraph._remove_iri_from_loading(inst.instance_iri)
    return instance_lst
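
A usage sketch; the IRI is a placeholder and sparql_client is assumed to be an existing PySparqlClient connected to the target triplestore:

devices = Device.pull_from_kg(
    ['https://www.theworldavatar.com/kg/ontolab/Device_1'],  # placeholder IRI
    sparql_client,
    recursive_depth=1,  # also pull objects one hop away via object properties
)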

pull_all_instances_from_kg classmethod

pull_all_instances_from_kg(sparql_client: PySparqlClient, recursive_depth: int = 0, force_overwrite_local: bool = False) -> Set[BaseClass]

This function pulls all instances of the calling class from the knowledge graph (triplestore). It calls the pull_from_kg function with the IRIs of all instances of the calling class. By default, it pulls the instances with no recursion.

Parameters:
    sparql_client (PySparqlClient): The SPARQL client that is used to pull the data from the KG. Required.
    recursive_depth (int): The depth of the recursion; 0 means no recursion, -1 means infinite recursion, n means n-level recursion. Defaults to 0.
    force_overwrite_local (bool): Whether to force overwrite the local values with the remote values. Defaults to False.

Returns:
    Set[BaseClass]: A set of objects that are pulled from the KG

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def pull_all_instances_from_kg(
    cls,
    sparql_client: PySparqlClient,
    recursive_depth: int = 0,
    force_overwrite_local: bool = False,
) -> Set[BaseClass]:
    """
    This function pulls all instances of the calling class from the knowledge graph (triplestore).
    It calls the pull_from_kg function with the IRIs of all instances of the calling class.
    By default, it pulls the instances with no recursion.

    Args:
        sparql_client (PySparqlClient): The SPARQL client that is used to pull the data from the KG
        recursive_depth (int): The depth of the recursion, 0 means no recursion, -1 means infinite recursion, n means n-level recursion
        force_overwrite_local (bool): Whether to force overwrite the local values with the remote values

    Returns:
        Set[BaseClass]: A set of objects that are pulled from the KG
    """
    iris = sparql_client.get_all_instances_of_class(cls.rdf_type)
    return cls.pull_from_kg(iris, sparql_client, recursive_depth, force_overwrite_local)

get_object_and_data_properties classmethod

get_object_and_data_properties() -> Dict[str, Dict[str, Union[str, Type[BaseProperty]]]]

This function returns the object and data properties of the calling class. This method calls the get_object_properties and get_data_properties functions and returns the combined dictionary.

Returns:
    Dict[str, Dict[str, Union[str, Type[BaseProperty]]]]: A dictionary containing the object and data properties of the calling class

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def get_object_and_data_properties(cls) -> Dict[str, Dict[str, Union[str, Type[BaseProperty]]]]:
    """
    This function returns the object and data properties of the calling class.
    This method calls the get_object_properties and get_data_properties functions and returns the combined dictionary.

    Returns:
        Dict[str, Dict[str, Union[str, Type[BaseProperty]]]]: A dictionary containing the object and data properties of the calling class
    """
    return {**cls.get_object_properties(), **cls.get_data_properties()}

get_object_properties classmethod

get_object_properties() -> Dict[str, Dict[str, Union[str, Type[ObjectProperty]]]]

This function returns the object properties of the calling class.

Returns:
    Dict[str, Dict[str, Union[str, Type[ObjectProperty]]]]: A dictionary containing the object properties of the calling class, in the format {predicate_iri: {'field': field_name, 'type': field_clz}}, e.g. {'https://twa.com/myObjectProperty': {'field': 'myObjectProperty', 'type': MyObjectProperty}}

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def get_object_properties(cls) -> Dict[str, Dict[str, Union[str, Type[ObjectProperty]]]]:
    """
    This function returns the object properties of the calling class.

    Returns:
        Dict[str, Union[str, Type[ObjectProperty]]]]: A dictionary containing the object properties of the calling class
            in the format of {predicate_iri: {'field': field_name, 'type': field_clz}}
            e.g. {'https://twa.com/myObjectProperty': {'field': 'myObjectProperty', 'type': MyObjectProperty}}
    """
    dct_op = {}
    for f, field_info in cls.model_fields.items():
        op = get_args(field_info.annotation)[0] if type(field_info.annotation) == _UnionGenericAlias else field_info.annotation
        if ObjectProperty._is_inherited(op):
            dct_op[op.predicate_iri] = {'field': f, 'type': op}
    return dct_op

get_data_properties classmethod

get_data_properties() -> Dict[str, Dict[str, Union[str, Type[DatatypeProperty]]]]

This function returns the data properties of the calling class.

Returns:
    Dict[str, Dict[str, Union[str, Type[DatatypeProperty]]]]: A dictionary containing the data properties of the calling class, in the format {predicate_iri: {'field': field_name, 'type': field_clz}}, e.g. {'https://twa.com/myDatatypeProperty': {'field': 'myDatatypeProperty', 'type': MyDatatypeProperty}}

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def get_data_properties(cls) -> Dict[str, Dict[str, Union[str, Type[DatatypeProperty]]]]:
    """
    This function returns the data properties of the calling class.

    Returns:
        Dict[str, Dict[str, Union[str, Type[DatatypeProperty]]]]: A dictionary containing the data properties of the calling class
            in the format of {predicate_iri: {'field': field_name, 'type': field_clz}}
            e.g. {'https://twa.com/myDatatypeProperty': {'field': 'myDatatypeProperty', 'type': MyDatatypeProperty}}
    """
    dct_dp = {}
    for f, field_info in cls.model_fields.items():
        dp = get_args(field_info.annotation)[0] if type(field_info.annotation) == _UnionGenericAlias else field_info.annotation
        if DatatypeProperty._is_inherited(dp):
            dct_dp[dp.predicate_iri] = {'field': f, 'type': dp}
    return dct_dp
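
Continuing the illustrative Lab and Device classes from the top of this page, the two lookups can be inspected as follows (the printed predicate IRIs depend on how the properties were declared):

print(Lab.get_object_properties())
# e.g. {'<predicate IRI of HasPart>': {'field': 'hasPart', 'type': HasPart}}
print(Device.get_data_properties())
# e.g. {'<predicate IRI of HasName>': {'field': 'hasName', 'type': HasName}}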

revert_local_changes

revert_local_changes()

This function reverts the local changes made to the Python object back to the cached values.

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def revert_local_changes(self):
    """ This function reverts the local changes made to the python object to cached values. """
    for f, field_info in self.model_fields.items():
        if BaseProperty._is_inherited(field_info.annotation):
            setattr(self, f, copy.deepcopy(self._latest_cache.get(f, field_info.annotation(set()))))
        else:
            setattr(self, f, copy.deepcopy(self._latest_cache.get(f, None)))

get_object_property_by_iri

get_object_property_by_iri(iri: str) -> ObjectProperty

This function returns the object property by the IRI of the property.

Parameters:
    iri (str): IRI of the object property. Required.

Returns:
    ObjectProperty: The object property

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def get_object_property_by_iri(self, iri: str) -> ObjectProperty:
    """
    This function returns the object property by the IRI of the property.

    Args:
        iri (str): IRI of the object property

    Returns:
        ObjectProperty: The object property
    """
    dct = self.__class__.get_object_properties()
    field_name = dct.get(iri, {}).get('field', None)
    if field_name is not None:
        return getattr(self, field_name)
    else:
        return None

push_to_kg

push_to_kg(sparql_client: PySparqlClient, recursive_depth: int = 0, pull_first: bool = False, maximum_retry: int = 0, force_overwrite_if_pull_first: bool = False) -> Tuple[Graph, Graph]

This function pushes the triples of the calling object to the knowledge graph (triplestore).

Parameters:
    sparql_client (PySparqlClient): The SPARQL client object to be used to push the triples. Required.
    recursive_depth (int): The depth of the recursion; 0 means no recursion, -1 means infinite recursion, n means n-level recursion. Defaults to 0.
    pull_first (bool): Whether to pull the latest triples from the KG before pushing the triples. Defaults to False.
    maximum_retry (int): The number of retries if any exception is raised during the SPARQL update. Defaults to 0.
    force_overwrite_if_pull_first (bool): Whether to force overwrite the local values with the remote values if pull_first is True. Defaults to False.

Returns:
    Tuple[Graph, Graph]: A tuple of two rdflib.Graph objects containing the triples to be removed and added

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def push_to_kg(
    self,
    sparql_client: PySparqlClient,
    recursive_depth: int = 0,
    pull_first: bool = False,
    maximum_retry: int = 0,
    force_overwrite_if_pull_first: bool = False,
) -> Tuple[Graph, Graph]:
    """
    This function pushes the triples of the calling object to the knowledge graph (triplestore).

    Args:
        sparql_client (PySparqlClient): The SPARQL client object to be used to push the triples
        recursive_depth (int): The depth of the recursion, 0 means no recursion, -1 means infinite recursion, n means n-level recursion
        pull_first (bool): Whether to pull the latest triples from the KG before pushing the triples
        maximum_retry (int): The number of retries if any exception was raised during SPARQL update
        force_overwrite_if_pull_first (bool): Whether to force overwrite the local values with the remote values if `pull_first` is `True`

    Returns:
        Tuple[Graph, Graph]: A tuple of two rdflib.Graph objects containing the triples to be removed and added
    """
    # TODO [future] what happens when KG changed during processing in the python side? race conditions...
    # NOTE when push, the objects in memory are loaded to collect diff and only stops when it's string (i.e. no object cached)
    # this supports the situation where recursive_depth specified here is greater than the value used to pull the object

    # pull the latest triples from the KG if needed
    if pull_first:
        self.__class__.pull_from_kg(self.instance_iri, sparql_client, recursive_depth, force_overwrite_if_pull_first)
    # type of changes: remove old triples, add new triples
    g_to_remove = Graph()
    g_to_add = Graph()
    g_to_remove, g_to_add = self._collect_diff_to_graph(g_to_remove, g_to_add, recursive_depth)

    # retry push if any exception is raised
    retry_delay = 2
    for attempt in range(0, maximum_retry + 1):
        try:
            sparql_client.delete_and_insert_graphs(g_to_remove, g_to_add)
            # if no exception was thrown, update cache
            self._create_cache(recursive_depth)
            return g_to_remove, g_to_add
        except Exception as e:
            if attempt < maximum_retry:
                time.sleep(retry_delay)
            else:
                raise e
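
A usage sketch; sparql_client is assumed to be an existing PySparqlClient, and the data property value is passed as a set in line with the set-based field types used throughout this page:

device = Device(hasName={'thermometer'})
g_removed, g_added = device.push_to_kg(
    sparql_client,
    recursive_depth=-1,   # push the full closure of connected objects
    pull_first=True,      # synchronise with the remote state before computing the diff
)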

graph

graph(g: Graph = None) -> Graph

This method adds all the outgoing triples of the calling object to the given rdflib.Graph (a new graph is created if none is provided).

Parameters:
    g (Graph): The rdflib.Graph object to which the triples should be added. Defaults to None (a new graph is created).

Returns:
    Graph: The rdflib.Graph object containing the triples added

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def graph(self, g: Graph = None) -> Graph:
    """
    This method adds all the outgoing triples of the calling object.

    Args:
        g (Graph, optional): The rdflib.Graph object to which the triples should be added

    Returns:
        Graph: The rdflib.Graph object containing the triples added
    """
    if g is None:
        g = Graph()
    g.add((URIRef(self.instance_iri), RDF.type, URIRef(self.rdf_type)))
    for f, field_info in self.model_fields.items():
        tp = get_args(field_info.annotation)[0] if type(field_info.annotation) == _UnionGenericAlias else field_info.annotation
        if ObjectProperty._is_inherited(tp):
            tp: ObjectProperty
            prop = getattr(self, f) if getattr(self, f) is not None else set()
            for o in prop:
                g.add((URIRef(self.instance_iri), URIRef(tp.predicate_iri), URIRef(o.instance_iri if isinstance(o, BaseClass) else o)))
        elif DatatypeProperty._is_inherited(tp):
            tp: DatatypeProperty
            prop = getattr(self, f) if getattr(self, f) is not None else set()
            for o in prop:
                g.add((URIRef(self.instance_iri), URIRef(tp.predicate_iri), Literal(o)))
        elif f == 'rdfs_comment' and bool(self.rdfs_comment):
            for comment in self.rdfs_comment:
                g.add((URIRef(self.instance_iri), RDFS.comment, Literal(comment)))
        elif f == 'rdfs_label' and bool(self.rdfs_label):
            for label in self.rdfs_label:
                g.add((URIRef(self.instance_iri), RDFS.label, Literal(label)))
    return g

triples

triples() -> str

This method generates the turtle representation for all outgoing triples of the calling object.

Returns:
    str: The outgoing triples in turtle format

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def triples(self) -> str:
    """
    This method generates the turtle representation for all outgoing triples of the calling object.

    Returns:
        str: The outgoing triples in turtle format
    """
    return self.graph().serialize(format='ttl')
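
For example, the outgoing triples of a single object (such as the illustrative device instance above) can be inspected locally without any triplestore:

g = device.graph()       # the outgoing triples as an rdflib.Graph
print(device.triples())  # or serialised directly to Turtle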

ObjectProperty

ObjectProperty(*args, **kwargs)

Bases: BaseProperty

Base class for object properties. It inherits the BaseProperty class.

Attributes:
    rdfs_isDefinedBy: The ontology that defines the property
    predicate_iri: The predicate IRI of the property
    owl_minQualifiedCardinality: The minimum qualified cardinality of the property (default is 0)
    owl_maxQualifiedCardinality: The maximum qualified cardinality of the property (default is None, meaning infinite)

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def __init__(self, *args, **kwargs):
    super().__init__(*args, **kwargs)

retrieve_cardinality classmethod

retrieve_cardinality() -> Tuple[int, int]

This method is used to retrieve the cardinality of the property.

Returns:
    Tuple[int, int]: The minimum and maximum cardinality of the property

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def retrieve_cardinality(cls) -> Tuple[int, int]:
    """
    This method is used to retrieve the cardinality of the property.

    Returns:
        Tuple[int, int]: The minimum and maximum cardinality of the property
    """
    return super().retrieve_cardinality()

create_from_base classmethod

create_from_base(class_name: str, ontology: Type[BaseOntology], min_cardinality: Optional[int] = 0, max_cardinality: Optional[int] = None) -> Type[ObjectProperty]

This method is used to create a new property class from the calling property class. The new property class will inherit the min and max cardinality from the calling class if not specified.

Parameters:
    class_name (str): The name of the new property class. Required.
    ontology (Type[BaseOntology]): The ontology that defines the property. Required.
    min_cardinality (Optional[int]): The minimum qualified cardinality of the property. Defaults to 0.
    max_cardinality (Optional[int]): The maximum qualified cardinality of the property. Defaults to None, meaning infinite.

Returns:
    Type[ObjectProperty]: The new property class

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def create_from_base(
    cls,
    class_name: str,
    ontology: Type[BaseOntology],
    min_cardinality: Optional[int] = 0,
    max_cardinality: Optional[int] = None,
) -> Type[ObjectProperty]:
    """
    This method is used to create a new property class from the calling property class.
    The new property class will inherit the min and max cardinality from the calling class if not specified.

    Args:
        class_name (str): The name of the new property class
        ontology (Type[BaseOntology]): The ontology that defines the property
        min_cardinality (Optional[int], optional): The minimum qualified cardinality of the property (defaults to 0)
        max_cardinality (Optional[int], optional): The maximum qualified cardinality of the property (defaults to None meaning infinite)

    Returns:
        Type[ObjectProperty]: The new property class
    """
    # NOTE we inherit cardinality from the calling cls if not specified
    return super().create_from_base(class_name, ontology, min_cardinality, max_cardinality)

TransitiveProperty

TransitiveProperty(*args, **kwargs)

Bases: ObjectProperty

Base class for transitive object properties. It inherits the ObjectProperty class.

Attributes:
    rdfs_isDefinedBy: The ontology that defines the property
    predicate_iri: The predicate IRI of the property
    owl_minQualifiedCardinality: The minimum qualified cardinality of the property (default is 0)
    owl_maxQualifiedCardinality: The maximum qualified cardinality of the property (default is None, meaning infinite)

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def __init__(self, *args, **kwargs):
    super().__init__(*args, **kwargs)

obtain_transitive_objects classmethod

obtain_transitive_objects(instance: Union[BaseClass, str]) -> Set

This function obtains the transitive objects of the instance for the transitive object property.

Parameters:
    instance (Union[BaseClass, str]): The instance for which the transitive objects are to be obtained. Required.

Returns:
    Set: The set that contains the transitive objects

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def obtain_transitive_objects(cls, instance: Union[BaseClass, str]) -> Set:
    """
    This function obtains the transitive objects of the instance for the transitive object property.

    Args:
        instance (Union[BaseClass, str]): The instance for which the transitive objects are to be obtained

    Returns:
        Set: The set that contains the transitive objects
    """
    # check if instance is a string and look it up in the knowledge graph
    if isinstance(instance, str):
        _inst = KnowledgeGraph.get_object_from_lookup(instance)
        if _inst is None:
            # warn if the instance is not found
            # there could be further transitive objects in the remote knowledge graph
            # but they are not looked up here
            warnings.warn(f"Transitive objects for object property {cls.predicate_iri} not looked up beyond instance {instance} as it is not found in the Python memory.")
            return set()
        else:
            instance = _inst

    # get the transitive objects from the instance using the predicate IRI
    _transitive_objects = instance.get_object_property_by_iri(cls.predicate_iri)
    # initialise the transitive objects set with a deep copy of _transitive_objects, or an empty set if it's None
    transitive_objects = set(copy.deepcopy(_transitive_objects)) if _transitive_objects else set()

    # if there are no transitive objects, return the initialised set (which is an empty set)
    if not _transitive_objects:
        return transitive_objects

    # recursively find and accumulate transitive objects for each object in _transitive_objects
    for o in _transitive_objects:
        transitive_objects = transitive_objects.union(cls.obtain_transitive_objects(o))

    return transitive_objects
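
An illustrative sketch, assuming the HasPart property from the top of this page were instead declared as a TransitiveProperty and lab is an existing Lab instance:

# collects parts, parts of parts, and so on, following the hasPart links held in Python memory
all_parts = HasPart.obtain_transitive_objects(lab)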

DatatypeProperty

DatatypeProperty(*args, **kwargs)

Bases: BaseProperty

Base class for data properties. It inherits the BaseProperty class.

Attributes:
    rdfs_isDefinedBy: The ontology that defines the property
    predicate_iri: The predicate IRI of the property
    owl_minQualifiedCardinality: The minimum qualified cardinality of the property (default is 0)
    owl_maxQualifiedCardinality: The maximum qualified cardinality of the property (default is None, meaning infinite)

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
def __init__(self, *args, **kwargs):
    super().__init__(*args, **kwargs)

retrieve_cardinality classmethod

retrieve_cardinality() -> Tuple[int, int]

This method is used to retrieve the cardinality of the property.

Returns:
    Tuple[int, int]: The minimum and maximum cardinality of the property

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def retrieve_cardinality(cls) -> Tuple[int, int]:
    """
    This method is used to retrieve the cardinality of the property.

    Returns:
        Tuple[int, int]: The minimum and maximum cardinality of the property
    """
    return super().retrieve_cardinality()

create_from_base classmethod

create_from_base(class_name: str, ontology: Type[BaseOntology], min_cardinality: Optional[int] = 0, max_cardinality: Optional[int] = None) -> Type[DatatypeProperty]

This method is used to create a new property class from the calling property class. The new property class will inherit the min and max cardinality from the calling class if not specified.

Parameters:
    class_name (str): The name of the new property class. Required.
    ontology (Type[BaseOntology]): The ontology that defines the property. Required.
    min_cardinality (Optional[int]): The minimum qualified cardinality of the property. Defaults to 0.
    max_cardinality (Optional[int]): The maximum qualified cardinality of the property. Defaults to None, meaning infinite.

Returns:
    Type[DatatypeProperty]: The new property class

Source code in JPS_BASE_LIB/python_wrapper/twa/data_model/base_ontology.py
@classmethod
def create_from_base(
    cls,
    class_name: str,
    ontology: Type[BaseOntology],
    min_cardinality: Optional[int] = 0,
    max_cardinality: Optional[int] = None,
) -> Type[DatatypeProperty]:
    """
    This method is used to create a new property class from the calling property class.
    The new property class will inherit the min and max cardinality from the calling class if not specified.

    Args:
        class_name (str): The name of the new property class
        ontology (Type[BaseOntology]): The ontology that defines the property
        min_cardinality (Optional[int], optional): The minimum qualified cardinality of the property (defaults to 0)
        max_cardinality (Optional[int], optional): The maximum qualified cardinality of the property (defaults to None meaning infinite)

    Returns:
        Type[DatatypeProperty]: The new property class
    """
    # NOTE we inherit cardinality from the calling cls if not specified
    return super().create_from_base(class_name, ontology, min_cardinality, max_cardinality)