# help
m
Hey team, I'm encountering the following with version 0.103.0:
KeyError: '_data_store'

  File "./python3.8/site-packages/lakefs_client/model_utils.py", line 189, in __getattr__
    return self.__getitem__(attr)
  File "./python3.8/site-packages/lakefs_client/model_utils.py", line 521, in __getitem__
    if name in self:
  File "./python3.8/site-packages/lakefs_client/model_utils.py", line 535, in __contains__
    return name in self.__dict__['_data_store']
Is it possible to help me understand why this would happen? For slightly more context: I execute a script, and this happens every other time I run it. I'm led to believe the trace of this error starts here:
from lakefs_client import models
It looks like the name that is not found is:
__setstate__
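The `__setstate__` lookup failing with `KeyError: '_data_store'` can be reproduced with a minimal stand-in class (the name `Model` and its internals are hypothetical, not `lakefs_client`'s actual code; only the `__getattr__` pattern matches the traceback). During unpickling, `pickle` creates a bare instance with `__new__` and then probes it for `__setstate__`. Since `__init__` never ran, `_data_store` doesn't exist yet, and a custom `__getattr__` that reads `self.__dict__['_data_store']` raises `KeyError` instead of the `AttributeError` that `pickle` would tolerate:

```python
import pickle

# Hypothetical stand-in for an openapi-generated model; the real class
# differs, but the __getattr__ -> __dict__['_data_store'] pattern is the same.
class Model:
    def __init__(self):
        self.__dict__['_data_store'] = {'message': 'hi'}

    def __getattr__(self, attr):
        # When pickle probes a freshly __new__'d instance for __setstate__,
        # __init__ has not run, so this lookup raises KeyError -- which
        # pickle does NOT suppress (it only suppresses AttributeError).
        store = self.__dict__['_data_store']
        try:
            return store[attr]
        except KeyError:
            raise AttributeError(attr) from None

data = pickle.dumps(Model())   # pickling succeeds
try:
    pickle.loads(data)         # unpickling blows up in __getattr__
except KeyError as e:
    print('KeyError:', e)      # KeyError: '_data_store'
```

This matches the traceback above: the failure is in `__getattr__` looking up `__setstate__`, before any of the model's own data is restored.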
b
Hi @Matthieu Oliveira
python3
Python 3.11.3 (main, Apr  7 2023, 20:13:31) [Clang 14.0.0 (clang-1400.0.29.202)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> from lakefs_client import models
>>>
Is any specific code being executed at the point of the error, or is it the import itself?
It is generated code, and the recommendation is to import the specific model you require
👍 1
m
The whole stack trace is:
KeyError: '_data_store'

Stack Trace:
  File "./python3.8/site-packages/dagster/_core/execution/plan/utils.py", line 54, in op_execution_error_boundary
    yield
  File "./python3.8/site-packages/dagster/_core/execution/plan/inputs.py", line 819, in _load_input_with_input_manager
    value = input_manager.load_input(context)
  File "./python3.8/site-packages/dagster/_core/storage/upath_io_manager.py", line 191, in load_input
    return self._load_single_input(path, context)
  File "./python3.8/site-packages/dagster/_core/storage/upath_io_manager.py", line 134, in _load_single_input
    obj = self.load_from_path(context=context, path=path)
  File "./python3.8/site-packages/dagster/_core/storage/fs_io_manager.py", line 173, in load_from_path
    return pickle.load(file)
  File "./python3.8/site-packages/lakefs_client/model_utils.py", line 189, in __getattr__
    return self.__getitem__(attr)
  File "./python3.8/site-packages/lakefs_client/model_utils.py", line 521, in __getitem__
    if name in self:
  File "./python3.8/site-packages/lakefs_client/model_utils.py", line 536, in __contains__
    return name in self.__dict__['_data_store']
It's possible it's an issue with Dagster, but maybe this will help.
b
like
from lakefs_client.model.diff_list import DiffList
m
Yeah, good call... I'll import CommitCreation directly
b
from the openapi generated code -
import all models into this package (models).
if you have many models here with many references from one model to another this may
raise a RecursionError
to avoid this, import only the models that you directly need like:
from from lakefs_client.model.pet import Pet (example)
or import this package, but before doing it, use:
import sys
sys.setrecursionlimit(n)
hope it helps
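The second option in the generated comment can be sketched as follows (3000 is an arbitrary example value, not a recommendation; pick one suited to how deeply your models reference each other):

```python
import sys

# Per the generated comment: raise the interpreter's recursion limit
# *before* doing the broad `from lakefs_client import models` import,
# so the chain of model-to-model imports doesn't hit RecursionError.
sys.setrecursionlimit(3000)
print(sys.getrecursionlimit())  # 3000
```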
a
Not a Python or Dagster expert, but I am suspicious of that pickle.load call. Might some Dagster code be serialising something that it shouldn't be?
🤔 1
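If the pickling is indeed coming from Dagster's filesystem IO manager serializing an op output, one hedged workaround is to hand it a plain dict instead of the generated model instance. openapi-generated clients expose `to_dict()` on their models; the helper name below is ours, and this is a sketch, not lakeFS's or Dagster's recommended pattern:

```python
import pickle

# Hypothetical helper: convert a generated model into a plain dict before
# it crosses an op boundary, so the pickle-based IO manager serializes a
# dict rather than a model with a custom __getattr__.
def to_picklable(obj):
    return obj.to_dict() if hasattr(obj, "to_dict") else obj

payload = to_picklable({"message": "my-example"})  # plain values pass through
roundtrip = pickle.loads(pickle.dumps(payload))    # dicts pickle cleanly
print(roundtrip["message"])  # my-example
```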
m
It's possible, I'll have to look into their implementation. It happens every other time the code runs. I have another Dagster op where I import
import lakefs_client
from lakefs_client.api import commits_api
from lakefs_client.model.commit_creation import CommitCreation

with lakefs_client.ApiClient(configuration) as api_client:
    # Create an instance of the CommitsApi class
    api_instance = commits_api.CommitsApi(api_client)
    repository = "my-repo"
    branch = "main"
    commit_creation = CommitCreation(
        message="my-example",
        metadata={
            "key": "my-value",
        },
        date='06-13-2023',
    )
and this snippet commits fine... so I'm going to replace what's breaking with this and see if it works
🙏 1