Importing zexp from Plone 4.3.* to Plone 5.1rc failed - plone-4.x

The import failed at http://127.0.0.1:8080/Plone/en/manage_importObject
What does this message mean?
How can I add a uid_catalog?
Traceback (innermost last):
Module ZPublisher.Publish, line 138, in publish
Module ZPublisher.mapply, line 77, in mapply
Module ZPublisher.Publish, line 48, in call_object
Module OFS.ObjectManager, line 627, in manage_importObject
Module OFS.ObjectManager, line 649, in _importObjectFromFile
Module Products.BTreeFolder2.BTreeFolder2, line 461, in _setObject
Module zope.event, line 31, in notify
Module zope.component.event, line 24, in dispatch
Module zope.component._api, line 136, in subscribers
Module zope.component.registry, line 321, in subscribers
Module zope.interface.adapter, line 585, in subscribers
Module zope.component.event, line 32, in objectEventNotify
Module zope.component._api, line 136, in subscribers
Module zope.component.registry, line 321, in subscribers
Module zope.interface.adapter, line 585, in subscribers
Module OFS.subscribers, line 110, in dispatchObjectMovedEvent
Module OFS.subscribers, line 143, in callManageAfterAdd
Module Products.Archetypes.BaseFolder, line 95, in manage_afterAdd
Module Products.Archetypes.BaseObject, line 160, in manage_afterAdd
- __traceback_info__: (<ATFolder at /Plone/en/news>, <ATFolder at /Plone/en/news>, <Folder at /Plone/en>)
Module Products.Archetypes.Referenceable, line 242, in manage_afterAdd
Module Products.Archetypes.Referenceable, line 212, in _updateCatalog
Module Products.CMFCore.utils, line 13, in check_getToolByName
Module Products.CMFCore.utils, line 120, in getToolByName
AttributeError: uid_catalog

Create the uid_catalog yourself:
Go to http://127.0.0.1:8080/Plone/manage_main
Choose "ZCatalog" from the select box
Click "Add" and give the new catalog the id uid_catalog
(screenshot of manage_main)


Encountering a TypeError when running allennlp test-install

I am using Windows 10 and pip-installed the latest allennlp branch. After the package installed successfully, I encountered the following:
$ allennlp test-install
Traceback (most recent call last):
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Users\604194\AppData\Local\Continuum\anaconda3\envs\domains\Scripts\allennlp.exe\__main__.py", line 4, in <module>
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\allennlp\run.py", line 15, in <module>
from allennlp.commands import main # pylint: disable=wrong-import-position
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\allennlp\commands\__init__.py", line 8, in <module>
from allennlp.commands.configure import Configure
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\allennlp\commands\configure.py", line 27, in <module>
from allennlp.service.config_explorer import make_app
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\allennlp\service\config_explorer.py", line 24, in <module>
from allennlp.common.configuration import configure, choices
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\allennlp\common\__init__.py", line 1, in <module>
from allennlp.common.from_params import FromParams
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\allennlp\common\from_params.py", line 48, in <module>
from allennlp.common.params import Params
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\allennlp\common\params.py", line 173, in <module>
class Params(MutableMapping):
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\allennlp\common\params.py", line 236, in Params
def pop(self, key: str, default: Any = DEFAULT) -> Any:
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\overrides\overrides.py", line 67, in overrides
return _overrides(method, check_signature, check_at_runtime)
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\overrides\overrides.py", line 93, in _overrides
_validate_method(method, super_class, check_signature)
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\overrides\overrides.py", line 114, in _validate_method
ensure_signature_is_compatible(super_method, method, is_static)
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\overrides\signature.py", line 87, in ensure_signature_is_compatible
super_sig, sub_sig, super_type_hints, sub_type_hints, is_static, method_name
File "c:\users\604194\appdata\local\continuum\anaconda3\envs\domains\lib\site-packages\overrides\signature.py", line 213, in ensure_all_positional_args_defined_in_sub
f"`{method_name}: {sub_param.name} must be a supertype of `{super_param.annotation}` but is `{sub_param.annotation}`"
TypeError: `Params.pop: key must be a supertype of `<class 'inspect._empty'>` but is `<class 'str'>`
Any advice on how to resolve this issue? Thanks!
EDIT: I have edited my question to add more information. Thanks!
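The error message points at the signature check done by the overrides package: newer releases (4.x and later) verify that an @overrides-decorated method's annotations are compatible with the superclass method. MutableMapping.pop carries no annotations at all, which is where the `<class 'inspect._empty'>` in the message comes from. A small stdlib-only illustration:

```python
import inspect
from collections.abc import MutableMapping

# overrides (>= 4.0) compares the subclass signature of Params.pop against
# MutableMapping.pop.  The superclass method is unannotated, so its
# parameters report inspect.Parameter.empty -- exactly the
# "<class 'inspect._empty'>" shown in the TypeError.
sig = inspect.signature(MutableMapping.pop)
print(sig.parameters["key"].annotation is inspect.Parameter.empty)  # True
```

A commonly reported workaround for this incompatibility is pinning the overrides package to a pre-4.0 release (e.g. `pip install "overrides<4"`) or upgrading allennlp to a version tested against the newer overrides; verify which combination your allennlp version expects.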

ERROR: Could not install packages due to an OSError: [Errno 22]

Hi, I'm trying to install voevent-parse via pip and ran into the following error:
ERROR: Could not install packages due to an OSError.
Traceback (most recent call last):
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 171, in _merge_into_criterion
crit = self.state.criteria[name]
KeyError: 'voevent-parse'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\commands\install.py", line 316, in run
requirement_set = resolver.resolve(
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\resolution\resolvelib\resolver.py", line 121, in resolve
self._result = resolver.resolve(
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 453, in resolve
state = resolution.resolve(requirements, max_rounds=max_rounds)
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 318, in resolve
name, crit = self._merge_into_criterion(r, parent=None)
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 173, in _merge_into_criterion
crit = Criterion.from_requirement(self._p, requirement, parent)
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 82, in from_requirement
if not cands:
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_vendor\resolvelib\structs.py", line 124, in __bool__
return bool(self._sequence)
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 99, in __bool__
return any(self)
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\resolution\resolvelib\factory.py", line 237, in iter_index_candidates
candidate = self._make_candidate_from_link(
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\resolution\resolvelib\factory.py", line 165, in _make_candidate_from_link
self._link_candidate_cache[link] = LinkCandidate(
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 300, in __init__
super().__init__(
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 144, in __init__
self.dist = self._prepare()
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 226, in _prepare
dist = self._prepare_distribution()
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 311, in _prepare_distribution
return self._factory.preparer.prepare_linked_requirement(
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\operations\prepare.py", line 457, in prepare_linked_requirement
return self._prepare_linked_requirement(req, parallel_builds)
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\operations\prepare.py", line 480, in _prepare_linked_requirement
local_file = unpack_url(
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\operations\prepare.py", line 240, in unpack_url
unpack_file(file.path, location, file.content_type)
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\utils\unpacking.py", line 266, in unpack_file
untar_file(filename, location)
File "c:\users\timo\appdata\local\programs\python\python39\lib\site-packages\pip\_internal\utils\unpacking.py", line 230, in untar_file
with open(path, 'wb') as destfp:
OSError: [Errno 22] Invalid argument: 'C:\\Users\\Timo\\AppData\\Local\\Temp\\pip-install-wpb7xknq\\voevent-parse_090e9cbdcae64f31a771cb0698293f1a\\src/voeventparse/fixtures/MOA_Lensing_Event_2015-07-10T14:50:54.00.xml'
Removed build tracker: 'C:\\Users\\Timo\\AppData\\Local\\Temp\\pip-req-tracker-cmjrk9m4'
I'm using the latest Windows 10 version and the latest pip version.
Please consider running your command line as administrator.
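Independent of permissions, note what the traceback actually failed on: the archive member pip tried to extract contains colons from the timestamp `14:50:54`, and Windows forbids colons (among other characters) in file names, which also surfaces as OSError [Errno 22]. A quick stdlib check (the helper name below is made up for illustration):

```python
# Windows forbids the characters < > : " | ? * in file names.  The path in
# the traceback contains ':' from a timestamp, which raises
# OSError [Errno 22] on Windows regardless of privileges.
WINDOWS_FORBIDDEN = set('<>:"|?*')

def invalid_windows_chars(filename):  # hypothetical helper
    return sorted(set(filename) & WINDOWS_FORBIDDEN)

print(invalid_windows_chars("MOA_Lensing_Event_2015-07-10T14:50:54.00.xml"))  # [':']
```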

Pipeline Loading Models and Tokenizers for Q&A

Hi, I'm trying to use 'fmikaelian/flaubert-base-uncased-squad' for question answering. I understand that I should load the model and the tokenizer, but I'm not sure how I should do this.
My code so far is basically:
from transformers import pipeline, BertTokenizer

nlp = pipeline('question-answering',
               model='fmikaelian/flaubert-base-uncased-squad',
               tokenizer='fmikaelian/flaubert-base-uncased-squad')
Most probably this can be solved with a two-liner.
Many thanks
EDIT
I have also tried to use AutoModel, but it seems the model is not in the list:
OSError: Model name 'flaubert-base-uncased-squad' was not found in model name list (bert-base-uncased, bert-large-uncased, bert-base-cased, bert-large-cased, bert-base-multilingual-uncased, bert-base-multilingual-cased, bert-base-chinese, bert-base-german-cased, bert-large-uncased-whole-word-masking, bert-large-cased-whole-word-masking, bert-large-uncased-whole-word-masking-finetuned-squad, bert-large-cased-whole-word-masking-finetuned-squad, bert-base-cased-finetuned-mrpc, bert-base-german-dbmdz-cased, bert-base-german-dbmdz-uncased). We assumed 'flaubert-base-uncased-squad' was a path or url to a configuration file named config.json or a directory containing such a file but couldn't find any such file at this path or url.
EDIT II
I tried following the suggested approach with the following code, which loads models saved on S3:
tokenizer_ = FlaubertTokenizer.from_pretrained(MODELS)
model_ = FlaubertModel.from_pretrained(MODELS)

p = transformers.QuestionAnsweringPipeline(
    model=transformers.AutoModel.from_pretrained(MODELS),
    tokenizer=transformers.AutoTokenizer.from_pretrained(MODELS)
)

question_ = "Quel est le montant de la garantie?"
language_ = "French"
context_ = "le montant de la garantie est € 1000"
output = p({'question': question_, 'context': context_})
print(output)
Unfortunately I have been getting the following error:
Two tracebacks are interleaved in the output (the parent process and the spawned child process write to the console simultaneously); untangled:
Child process:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\spawn.py", line 105, in spawn_main
exitcode = _main(fd)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\spawn.py", line 114, in _main
prepare(preparation_data)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\spawn.py", line 225, in prepare
_fixup_main_from_path(data['init_main_from_path'])
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
run_name="__mp_main__")
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\runpy.py", line 263, in run_path
pkg_name=pkg_name, script_name=fname)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Users\... ...\Box Sync\nlp - 2...\NLP\src\question_extraction.py", line 61, in <module>
output=p({'question':question_, 'context': context_})
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\site-packages\transformers\pipelines.py", line 802, in __call__
for example in examples
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\site-packages\transformers\pipelines.py", line 802, in <listcomp>
for example in examples
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\site-packages\transformers\data\processors\squad.py", line 304, in squad_convert_examples_to_features
with Pool(threads, initializer=squad_convert_example_to_features_init, initargs=(tokenizer,)) as p:
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\context.py", line 119, in Pool
context=self.get_context())
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\pool.py", line 174, in __init__
self._repopulate_pool()
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\pool.py", line 239, in _repopulate_pool
w.start()
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\process.py", line 105, in start
self._popen = self._Popen(self)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\context.py", line 322, in _Popen
return Popen(process_obj)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\popen_spawn_win32.py", line 33, in __init__
prep_data = spawn.get_preparation_data(process_obj._name)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\spawn.py", line 143, in get_preparation_data
_check_not_importing_main()
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\spawn.py", line 136, in _check_not_importing_main
is not going to be frozen to produce an executable.''')
RuntimeError:
An attempt has been made to start a new process before the
current process has finished its bootstrapping phase.
This probably means that you are not using fork to start your
child processes and you have forgotten to use the proper idiom
in the main module:
if __name__ == '__main__':
freeze_support()
...
The "freeze_support()" line can be omitted if the program
is not going to be frozen to produce an executable.
Parent process:
Traceback (most recent call last):
File "question_extraction.py", line 61, in <module>
output=p({'question':question_, 'context': context_})
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\site-packages\transformers\pipelines.py", line 802, in __call__
for example in examples
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\site-packages\transformers\pipelines.py", line 802, in <listcomp>
for example in examples
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\site-packages\transformers\data\processors\squad.py", line 304, in squad_convert_examples_to_features
with Pool(threads, initializer=squad_convert_example_to_features_init, initargs=(tokenizer,)) as p:
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\context.py", line 119, in Pool
context=self.get_context())
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\pool.py", line 174, in __init__
self._repopulate_pool()
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\pool.py", line 239, in _repopulate_pool
w.start()
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\process.py", line 105, in start
self._popen = self._Popen(self)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\context.py", line 322, in _Popen
return Popen(process_obj)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
reduction.dump(process_obj, to_child)
File "C:\Users\... ...\AppData\Local\Continuum\Anaconda3\envs\nlp_nlp\lib\multiprocessing\reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
BrokenPipeError: [Errno 32] Broken pipe
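The RuntimeError is CPython's standard guard for the "spawn" start method used on Windows: each worker process re-imports the main module, so any code that creates a Pool at module level re-triggers process creation. A minimal sketch of the required idiom (a toy example, not the transformers code):

```python
import multiprocessing as mp

def square(x):
    return x * x

# Under the "spawn" start method each worker re-imports this module, so the
# Pool creation must be guarded; otherwise every child would try to start
# its own pool, raising the RuntimeError shown above.
if __name__ == "__main__":
    with mp.Pool(2) as pool:
        print(pool.map(square, [1, 2, 3]))  # [1, 4, 9]
```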
EDIT IV
I solved the previous error by placing the function calls inside the "main" guard.
Unfortunately when I run the following code:
tokenizer_ = FlaubertTokenizer.from_pretrained(MODELS)
model_ = FlaubertModel.from_pretrained(MODELS)

def question_extraction(text, question, model, tokenizer, language="French", verbose=False):
    if language == "French":
        nlp = pipeline('question-answering',
                       model=model,
                       tokenizer=tokenizer)
    else:
        nlp = pipeline('question-answering')
    output = nlp({'question': question, 'context': text})
    answer, score = output.answer, output.score
    if verbose == True:
        print("Q: ", question, "\n",
              "A:", answer, "\n",
              "Confidence (%):", "{0:.2f}".format(score * 100))
    return answer, score

if __name__ == "__main__":
    question_ = "Quel est le montant de la garantie?"
    language_ = "French"
    text = "le montant de la garantie est € 1000"
    answer, score = question_extraction(text, question_, model_, tokenizer_, language_, verbose=True)
I'm getting the following error:
C:\...\NLP\src>python question_extraction.py
OK
OK
convert squad examples to features: 100%|████████████████████████████████████████████████| 1/1 [00:00<00:00, 4.66it/s]
add example index and unique id: 100%|███████████████████████████████████████████████████████████| 1/1 [00:00<?, ?it/s]
Traceback (most recent call last):
File "question_extraction.py", line 77, in <module>
answer, score=question_extraction(text, question_, model_, tokenizer_, language_, verbose= True)
File "question_extraction.py", line 60, in question_extraction
output=nlp({'question':question, 'context': text})
File "C:\...\transformers\pipelines.py", line 818, in __call__
start, end = self.model(**fw_args)
ValueError: not enough values to unpack (expected 2, got 1)
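The failing line in the traceback is `start, end = self.model(**fw_args)`: the question-answering pipeline expects the model to return two tensors (start and end logits), while a plain AutoModel/FlaubertModel returns only its hidden states. A toy illustration of the unpacking (the stand-in functions below are hypothetical, not the transformers API):

```python
# The pipeline executes:  start, end = self.model(**fw_args)
# A base model returns one output; a QA-head model returns two.
def base_model():        # stands in for AutoModel / FlaubertModel
    return ("hidden_states",)

def qa_model():          # stands in for a *ForQuestionAnswering model
    return ("start_logits", "end_logits")

try:
    start, end = base_model()
except ValueError as e:
    print(e)             # not enough values to unpack (expected 2, got 1)

start, end = qa_model()  # unpacks cleanly
```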
As stated in the source, there is a specific QuestionAnsweringPipeline. The example below is what I used to successfully load the Flaubert model.
import transformers as trf

p = trf.QuestionAnsweringPipeline(
    model=trf.AutoModel.from_pretrained("fmikaelian/flaubert-base-uncased-squad"),
    tokenizer=trf.AutoTokenizer.from_pretrained("fmikaelian/flaubert-base-uncased-squad")
)
Of course, there is also the alternative of using the pre-trained FlaubertForQuestionAnswering model directly, since pipelines were only just released and might be subject to change.

KeyError: 'prefs_main_template'

I tried to install a custom package that was made for Plone 4 with Archetypes.
Accessing http://0.0.0.0:8086/Plone/##site-controlpanel produces these errors.
What does this mean?
2017-10-31 11:50:06 ERROR Zope.SiteErrorLog 1509450606.040.577270977101 http://0.0.0.0:8086/Plone/##site-controlpanel
Traceback (innermost last):
Module ZPublisher.Publish, line 138, in publish
Module ZPublisher.mapply, line 77, in mapply
Module ZPublisher.Publish, line 48, in call_object
Module plone.z3cform.layout, line 67, in __call__
Module plone.z3cform.layout, line 84, in render
Module Products.Five.browser.pagetemplatefile, line 125, in __call__
Module Products.Five.browser.pagetemplatefile, line 59, in __call__
Module zope.pagetemplate.pagetemplate, line 132, in pt_render
Module five.pt.engine, line 98, in __call__
Module z3c.pt.pagetemplate, line 163, in render
Module chameleon.zpt.template, line 261, in render
Module chameleon.template, line 191, in render
Module chameleon.template, line 171, in render
Module e50eb39ab1e3e2ab91a240ce2e2f728e.py, line 233, in render
Module five.pt.expressions, line 154, in __call__
Module five.pt.expressions, line 123, in traverse
Module OFS.Traversable, line 313, in unrestrictedTraverse
- __traceback_info__: ([], 'prefs_main_template')
KeyError: 'prefs_main_template'
- Expression: "here/prefs_main_template/macros/master"
- Filename: ... .7.egg/plone/app/registry/browser/controlpanel_layout.pt
- Location: (line 6: col 23)
- Source: ... al:use-macro="here/prefs_main_template/macros/master"
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Arguments: repeat: {...} (0)
template: <ViewPageTemplateFile - at 0xb3b0e9ecL>
views: <ViewMapper - at 0xb0e1a30cL>
modules: <instance - at 0xb6393b8cL>
args: <tuple - at 0xb74cd02cL>
here: <ImplicitAcquisitionWrapper Plone at 0xb11b76e4L>
user: <ImplicitAcquisitionWrapper - at 0xb114b52cL>
nothing: <NoneType - at 0x83132e4>
container: <ImplicitAcquisitionWrapper Plone at 0xb11b76e4L>
request: <instance - at 0xb150ab0cL>
wrapped_repeat: <SafeMapping - at 0xb1707fa4L>
traverse_subpath: <list - at 0xb1ad51ccL>
default: <object - at 0xb74f3a18L>
loop: {...} (0)
context: <ImplicitAcquisitionWrapper Plone at 0xb11b76e4L>
view: <SiteControlPanel site-controlpanel at 0xb1576becL>
translate: <function translate at 0xb30eea04L>
root: <ImplicitAcquisitionWrapper Zope at 0xb36ca9b4L>
options: {...} (0)
target_language: <NoneType - at 0x83132e4>
Then I added a breakpoint in Zope2-2.13.26-py2.7.egg/OFS/Traversable.py:
if getattr(aq_base(obj), name, _marker) is not _marker:
    if restricted:
        next = guarded_getattr(obj, name)
    else:
        import pdb; pdb.set_trace()
        print obj, name, getattr(aq_base(obj), name, _marker)
        next = getattr(obj, name)
The name is prefs_main_template.
The obj is PloneSite.
And getattr returns None.
To fix that error only, you can add a copy of the original template into one of your skin-folders.
To get the original template, go to https://github.com/plone/Products.CMFPlone/blob/master/Products/CMFPlone/skins/plone_prefs/prefs_main_template.pt and choose the tag matching the Plone version you're using in the dropdown menu where it says "Branch: master".
However, the real problem is that Plone's skin-folders have been deleted.
Since one shouldn't do that, you'd either need to recreate the missing skin-files or consider creating a new Plone site with the same add-ons installed and moving the content from the old site to the new one.

Celery not starting in OS X - dbm.error: db type is dbm.gnu, but the module is not available

I'm trying to run a celery worker on OS X (Mavericks). I activated a virtual environment (Python 3.4) and tried to start Celery with this command:
celery worker --app=scheduling -linfo
Where scheduling is my celery app.
But I ended up with this error: dbm.error: db type is dbm.gnu, but the module is not available
Complete stacktrace:
Traceback (most recent call last):
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/kombu/utils/__init__.py", line 320, in __get__
return obj.__dict__[self.__name__]
KeyError: 'db'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/other/PhoenixEnv/bin/celery", line 9, in <module>
load_entry_point('celery==3.1.9', 'console_scripts', 'celery')()
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/__main__.py", line 30, in main
main()
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bin/celery.py", line 80, in main
cmd.execute_from_commandline(argv)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bin/celery.py", line 768, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bin/base.py", line 308, in execute_from_commandline
return self.handle_argv(self.prog_name, argv[1:])
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bin/celery.py", line 760, in handle_argv
return self.execute(command, argv)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bin/celery.py", line 692, in execute
).run_from_argv(self.prog_name, argv[1:], command=argv[0])
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bin/worker.py", line 175, in run_from_argv
return self(*args, **options)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bin/base.py", line 271, in __call__
ret = self.run(*args, **kwargs)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bin/worker.py", line 209, in run
).start()
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/worker/__init__.py", line 100, in __init__
self.setup_instance(**self.prepare_args(**kwargs))
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/worker/__init__.py", line 141, in setup_instance
self.blueprint.apply(self, **kwargs)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bootsteps.py", line 221, in apply
step.include(parent)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bootsteps.py", line 347, in include
return self._should_include(parent)[0]
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/bootsteps.py", line 343, in _should_include
return True, self.create(parent)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/worker/components.py", line 220, in create
w._persistence = w.state.Persistent(w.state, w.state_db, w.app.clock)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/worker/state.py", line 161, in __init__
self.merge()
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/worker/state.py", line 169, in merge
self._merge_with(self.db)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/kombu/utils/__init__.py", line 322, in __get__
value = obj.__dict__[self.__name__] = self.__get(obj)
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/worker/state.py", line 238, in db
return self.open()
File "/Users/other/PhoenixEnv/lib/python3.4/site-packages/celery/worker/state.py", line 165, in open
self.filename, protocol=self.protocol, writeback=True,
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/shelve.py", line 239, in open
return DbfilenameShelf(filename, flag, protocol, writeback)
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/shelve.py", line 223, in __init__
Shelf.__init__(self, dbm.open(filename, flag), protocol, writeback)
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/dbm/__init__.py", line 91, in open
"available".format(result))
dbm.error: db type is dbm.gnu, but the module is not available
Please help.
I switched to Python 3.5 and got the same error. On Ubuntu I could fix it with:
aptitude install python3.5-gdbm
I got the same error. On my MacBook I could fix it with:
brew install gdbm
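After installing gdbm (and, if necessary, reinstalling a Python built against it), you can check directly whether the interpreter can see the GNU dbm backend; a minimal sketch:

```python
# dbm.open() dispatches on the file's detected type; the error above means
# the shelve file was created with dbm.gnu (the _gdbm C module), which the
# current interpreter cannot import.
def gdbm_available():
    try:
        import dbm.gnu  # requires _gdbm, built against the gdbm library
        return True
    except ImportError:
        return False

print(gdbm_available())
```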
