Accept retrieving fewer fields than requested in the MARS Web API?

I'm trying to download a 25-day-ahead forecast from the ECMWF MARS Web API for all of 2018. These forecasts (WAEF Control Forecast) are only published on Mondays and Thursdays, and this is where I'm running into problems fetching the data with the MARS Web API.
I tried requesting the intuitive 2018-01-01/to/2018-12-31, but since there are five days each week for which there are no fields to retrieve, the request fails.
My MARS request file is as follows:
retrieve,
class=od,
date=2018-01-01/to/2018-12-31,
expver=1,
param=229.140/245.140,
step=600/624/648/672,
stream=waef,
time=00:00:00,
type=cf,
target="output.grib"
Which results in the following response:
...
mars - INFO - 20190215.100826 - Welcome to MARS
mars - INFO - 20190215.100826 - MARS Client build stamp: 20190130224336
mars - INFO - 20190215.100826 - MARS Client version: 6.23.3
mars - INFO - 20190215.100826 - MIR version: 1.1.2
mars - INFO - 20190215.100826 - Using ecCodes version 2.10.1
mars - INFO - 20190215.100826 - Using odb_api version: 0.15.9 (file format version: 0.5)
mars - INFO - 20190215.100826 - Maximum retrieval size is 30.00 G
retrieve,target="output.grib",stream=waef,param=229.140/245.140,padding=0,step=600/624/648/672,expver=1,time=00:00:00,date=2018-01-01/to/2018-12-31,type=cf,class=od
mars - WARN - 20190215.100826 - For wave data, LEVTYPE forced to Surface
mars - INFO - 20190215.100826 - Automatic split by date is on
mars - INFO - 20190215.100826 - Request has been split into 12 monthly retrievals
mars - INFO - 20190215.100826 - Processing request 1
RETRIEVE,
CLASS = OD,
TYPE = CF,
STREAM = WAEF,
EXPVER = 0001,
REPRES = SH,
LEVTYPE = SFC,
PARAM = 229.140/245.140,
TIME = 0000,
STEP = 600/624/648/672,
DOMAIN = G,
TARGET = "output.grib",
PADDING = 0,
DATE = 20180101/20180102/20180103/20180104/20180105/20180106/20180107/20180108/20180109/20180110/20180111/20180112/20180113/20180114/20180115/20180116/20180117/20180118/20180119/20180120/20180121/20180122/20180123/20180124/20180125/20180126/20180127/20180128/20180129/20180130/20180131
mars - INFO - 20190215.100826 - Web API request id: xxx
mars - INFO - 20190215.100826 - Requesting 248 fields
mars - INFO - 20190215.100826 - Calling mars on 'marsod', callback on 36551
mars - INFO - 20190215.100827 - Server task is 228 [marsod]
mars - INFO - 20190215.100827 - Request cost: 72 fields, 17.2754 Mbytes on 1 tape, nodes: hpss [marsod]
2019-02-15 11:08:59 Request is active
mars - INFO - 20190215.102300 - Transfering 18114554 bytes
mars - WARN - 20190215.102301 - Visiting database marsod : expected 248, got 72
mars - ERROR - 20190215.102301 - Expected 248, got 72.
mars - ERROR - 20190215.102301 - Request failed
...
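To spell out the numbers in the log: for January, MARS expects 31 days × 2 parameters × 4 steps = 248 fields, but only 9 of those days (5 Mondays and 4 Thursdays) actually have data, which gives the 9 × 2 × 4 = 72 fields the archive returned.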
Is there any way to allow receiving fewer fields than requested, or is there another elegant solution to this problem besides requesting only the correct dates for Mondays and Thursdays?

I managed to find the answer in the MARS documentation after all. Using expect = any in the control section solved the issue. More information can be found here: https://confluence.ecmwf.int/pages/viewpage.action?pageId=43521134
retrieve,
class=od,
date=2018-01-01/to/2018-12-31,
expver=1,
param=229.140/245.140,
step=600/624/648/672,
stream=waef,
time=00:00:00,
type=cf,
expect=any,
target="output.grib"

Related

Problems with Cypress on a Windows PC

I have installed Cypress on two PCs with similar specs. On the first one Cypress works: it's fast and doesn't give me any problems. On the second one I'm having difficulties:
the cy.visit command takes a very long time to reach my site, up to 60 seconds, even though the site itself is perfectly reachable
if I close the test execution and relaunch it, the test runner opens on an "about:blank" page
I've tried everything: uninstalled Cypress, cleared the Cypress cache, uninstalled Node.js and npm, and then reinstalled the whole thing. But the problems remain. If you need more information, I remain available. Below is what is printed in PowerShell during the first test that contains a cy.visit (it took more than 80 seconds in all):
GET /__/ 200 45.608 ms - -
GET /__cypress/runner/cypress_runner.css 200 68.125 ms - -
GET /chrome-variations/seed?osname=win&channel=stable&milestone=89 304 652.798 ms - -
POST /ListAccounts?gpsia=1&source=ChromiumBrowser&json=standard 200 604.179 ms - -
GET /__cypress/runner/cypress_runner.js 200 71.181 ms - -
GET /__cypress/static/favicon.ico 200 4.437 ms - -
GET /__cypress/iframes/integration/pr/te.js 200 7.769 ms - 843
GET /__cypress/runner/fonts/fa-solid-900.woff2 200 3.737 ms - 76120
GET /v1/pages/ChRDaHJvbWUvODkuMC40Mzg5LjExNBIQCcmLdcaTM2VsEgUNkWGVThIXCQPNs0E5ZBKtEgUNL4g-ghIFDZFhlU4=?alt=proto 200 385.975 ms - -
GET /__cypress/tests?p=cypress%5Cintegration%5Cpr%5Cte.js 200 8102.285 ms - -
GET /__cypress/tests?p=cypress%5Csupport%5Cindex.js 200 8122.208 ms - -
GET /__/ 200 1.715 ms - -
GET /__cypress/runner/cypress_runner.css 200 1.567 ms - -
GET /__cypress/runner/cypress_runner.js 200 4.923 ms - -
GET /__cypress/iframes/integration/pr/te.js 200 2.085 ms - 849
GET /__cypress/runner/fonts/fa-solid-900.woff2 200 2.107 ms - 76120
GET /__cypress/tests?p=cypress%5Csupport%5Cindex.js 200 9.996 ms - -
GET /__cypress/tests?p=cypress%5Cintegration%5Cpr%5Cte.js 200 5.663 ms - -
GET /v1/pages/ChRDaHJvbWUvODkuMC40Mzg5LjExNBIQCXTM6vS20UQ9EgUNkWGVThIXCRevpbqZx6oNEgUNL4g-ghIFDZFhlU4=?alt=proto 200 87.250 ms - -
GET /men/1-1-hummingbird-printed-t-shirt.html 200 201.834 ms - -
GET /themes/classic/assets/css/theme.css 200 358.023 ms - -
GET /js/jquery/ui/themes/base/minified/jquery.ui.theme.min.css 200 221.726 ms - -
GET /modules/ps_imageslider/css/homeslider.css 200 299.205 ms - -
GET /themes/classic/assets/css/custom.css 200 294.395 ms - -
GET /themes/core.js 200 177.118 ms - -
GET /themes/classic/assets/js/theme.js 200 537.278 ms - -
GET /modules/ps_emailsubscription/views/js/ps_emailsubscription.js 200 148.071 ms - -
GET /js/jquery/ui/jquery-ui.min.js 200 281.471 ms - -
GET /modules/ps_imageslider/js/responsiveslides.min.js 200 237.210 ms - -
GET /modules/ps_imageslider/js/homeslider.js 200 388.981 ms - -
GET /modules/ps_searchbar/ps_searchbar.js 200 186.480 ms - -
GET /modules/ps_shoppingcart/ps_shoppingcart.js 200 272.364 ms - -
GET /themes/classic/assets/js/custom.js 200 235.551 ms - -
GET /img/logo.png 200 219.762 ms - -
GET /2-large_default/hummingbird-printed-t-shirt.jpg 200 1297.510 ms - -
GET /2-home_default/hummingbird-printed-t-shirt.jpg 200 164.662 ms - -
GET /modules/blockreassurance/img/ic_verified_user_black_36dp_1x.png 200 217.993 ms - -
GET /modules/blockreassurance/img/ic_local_shipping_black_36dp_1x.png 200 160.820 ms - -
GET /modules/blockreassurance/img/ic_swap_horiz_black_36dp_1x.png 200 147.800 ms - -
GET /img/m/1.jpg 200 115.767 ms - -
GET /js/jquery/ui/themes/base/minified/jquery-ui.min.css 200 12482.121 ms - -
GET /themes/classic/assets/css/082a71677e756fb75817e8f262a07cb4.svg 200 133.134 ms - -
GET /themes/classic/assets/css/8b05d51ede908907d65695558974d86f.svg 200 135.111 ms - -
GET /js/jquery/ui/themes/base/minified/images/ui-bg_flat_75_ffffff_40x100.png 200 193.227 ms - -
GET /themes/classic/assets/css/b1db819132e64a3e01911a1413c33acf.svg 200 499.059 ms - -
GET /themes/classic/assets/css/570eb83859dc23dd0eec423a49e147fe.woff2 200 228.871 ms - -
GET /themes/classic/assets/css/19c1b868764c0e4d15a45d3f61250488.woff2 200 268.473 ms - -
GET /themes/classic/assets/css/199038f07312bfc6f0aabd3ed6a2b64d.woff2 200 230.125 ms - -
GET /v1/pages/ChRDaHJvbWUvODkuMC40Mzg5LjExNBIQCSH1MkjtAoVnEgUNu1dWahIXCbpTXag6Wh1tEgUNjsJ8QRIFDctunW4SEAl5ngwDYftELxIFDYOoWz0=?alt=proto 200 101.980 ms - -
GET /themes/classic/assets/css/ffddcb3736980b23405b31142a324b62.svg 200 13351.874 ms - -
GET /themes/classic/assets/css/99db8adec61e4fcf5586e1afa549b432.svg 200 13350.633 ms - -
GET /themes/classic/assets/css/e049aeb07a2ae1627933e8e58d3886d2.svg 200 13445.685 ms - -
POST /carrello 200 389.199 ms - -
POST /module/ps_shoppingcart/ajax 200 388.407 ms - -
GET /__cypress/runner/fonts/fa-regular-400.woff2 200 6.671 ms - 13600
GET /carrello?action=show 200 443.053 ms - -
GET /js/jquery/ui/themes/base/minified/jquery.ui.theme.min.css 200 133.347 ms - -
GET /js/jquery/ui/themes/base/minified/jquery-ui.min.css 200 149.125 ms - -
GET /themes/classic/assets/css/custom.css 200 145.832 ms - -
GET /themes/classic/assets/css/theme.css 200 165.264 ms - -
EDIT: thanks to advice received in the chat https://gitter.im/cypress-io/cypress, I concluded that the problem was related to Chrome. I uninstalled and reinstalled the browser and the problem was resolved.

I constantly get ResolvePackageNotFound

When I type conda env create -f environment.yml
I constantly get
Collecting package metadata (repodata.json): done
Solving environment: failed
ResolvePackageNotFound:
- tk==8.6.8=hbc83047_0
- zlib==1.2.11=h7b6447c_3
- av==8.0.2=py37h06622b3_4
- lame==3.100=h7f98852_1001
- xz==5.2.4=h14c3975_4
- mkl_random==1.0.2=py37hd81dba3_0
- x264==1!152.20180806=h14c3975_0
- numpy-base==1.16.4=py37hde5b4d6_0
- certifi==2020.12.5=py37h06a4308_0
- _openmp_mutex==4.5=1_llvm
- llvm-openmp==11.0.0=hfc4b9b4_1
- freetype==2.9.1=h8a8886c_1
- scikit-learn==0.22.1=py37hd81dba3_0
- libgfortran-ng==7.3.0=hdf63c60_0
- readline==7.0=h7b6447c_5
- mkl_fft==1.0.12=py37ha843d7b_0
- libpng==1.6.37=hbc83047_0
- libedit==3.1.20181209=hc058e9b_0
- libffi==3.2.1=hd88cf55_4
- nettle==3.6=he412f7d_0
- gnutls==3.6.13=h85f3911_1
- python==3.7.3=h0371630_0
- gmp==6.2.1=h58526e2_0
- _libgcc_mutex==0.1=conda_forge
- libgcc-ng==9.3.0=h5dbcf3e_17
- mkl-service==2.3.0=py37he904b0f_0
- ffmpeg==4.3.1=h3215721_1
- openh264==2.1.1=h8b12597_0
- mkl==2019.4=243
- numpy==1.16.4=py37h7e9f1db_0
- ca-certificates==2020.12.8=h06a4308_0
- libiconv==1.16=h516909a_0
- intel-openmp==2019.4=243
- libstdcxx-ng==9.1.0=hdf63c60_0
- zstd==1.3.7=h0b5b093_0
- ncurses==6.1=he6710b0_1
- jpeg==9b=h024ee3a_2
- openssl==1.1.1i=h27cfd23_0
- bzip2==1.0.8=h7f98852_4
- sqlite==3.28.0=h7b6447c_0
- libtiff==4.0.10=h2733197_2
What should I do?
My yml file is:
name: StyleFlow
channels:
- anaconda
- defaults
- conda-forge
dependencies:
- _libgcc_mutex=0.1=conda_forge
- _openmp_mutex=4.5=1_llvm
- av=8.0.2=py37h06622b3_4
- blas=1.0=mkl
- bzip2=1.0.8=h7f98852_4
- ca-certificates=2020.12.8=h06a4308_0
- certifi=2020.12.5=py37h06a4308_0
- ffmpeg=4.3.1=h3215721_1
- freetype=2.9.1=h8a8886c_1
- gmp=6.2.1=h58526e2_0
- gnutls=3.6.13=h85f3911_1
- intel-openmp=2019.4=243
- joblib=0.14.1=py_0
- jpeg=9b=h024ee3a_2
- lame=3.100=h7f98852_1001
- libedit=3.1.20181209=hc058e9b_0
- libffi=3.2.1=hd88cf55_4
- libgcc-ng=9.3.0=h5dbcf3e_17
- libgfortran-ng=7.3.0=hdf63c60_0
- libiconv=1.16=h516909a_0
- libpng=1.6.37=hbc83047_0
- libstdcxx-ng=9.1.0=hdf63c60_0
- libtiff=4.0.10=h2733197_2
- llvm-openmp=11.0.0=hfc4b9b4_1
- mkl=2019.4=243
- mkl-service=2.3.0=py37he904b0f_0
- mkl_fft=1.0.12=py37ha843d7b_0
- mkl_random=1.0.2=py37hd81dba3_0
- natsort=6.0.0=py_0
- ncurses=6.1=he6710b0_1
- nettle=3.6=he412f7d_0
- numpy=1.16.4=py37h7e9f1db_0
- numpy-base=1.16.4=py37hde5b4d6_0
- olefile=0.46=py37_0
- openh264=2.1.1=h8b12597_0
- openssl=1.1.1i=h27cfd23_0
- pip=19.1.1=py37_0
- python=3.7.3=h0371630_0
- python_abi=3.7=1_cp37m
- readline=7.0=h7b6447c_5
- scikit-learn=0.22.1=py37hd81dba3_0
- setuptools=41.0.1=py37_0
- sqlite=3.28.0=h7b6447c_0
- tk=8.6.8=hbc83047_0
- wheel=0.33.4=py37_0
- x264=1!152.20180806=h14c3975_0
- xz=5.2.4=h14c3975_4
- zlib=1.2.11=h7b6447c_3
- zstd=1.3.7=h0b5b093_0
- pip:
- absl-py==0.7.1
- appdirs==1.4.4
- astor==0.8.0
- astunparse==1.6.3
- attrs==19.1.0
- backcall==0.1.0
- bleach==3.1.0
- cachetools==4.1.0
- cffi==1.12.3
- chardet==3.0.4
- cloudpickle==1.2.1
- cycler==0.10.0
- cytoolz==0.9.0.1
- dask==2.1.0
- decorator==4.4.0
- defusedxml==0.6.0
- deprecated==1.2.6
- dill==0.2.9
- dlib==19.21.0
- dominate==2.3.5
- easydict==1.9
- entrypoints==0.3
- gast==0.2.2
- google-auth==1.14.3
- google-auth-oauthlib==0.4.1
- google-pasta==0.2.0
- grpcio==1.22.0
- h5py==2.10.0
- helpdev==0.6.10
- idna==2.8
- imageio==2.5.0
- importlib-metadata==0.18
- imutils==0.5.3
- ipykernel==5.1.1
- ipython==7.6.0
- ipython-genutils==0.2.0
- ipywidgets==7.4.2
- jedi==0.13.3
- jinja2==2.10.1
- jsonschema==3.0.1
- jupyter==1.0.0
- jupyter-client==5.2.4
- jupyter-console==6.0.0
- jupyter-core==4.5.0
- keras==2.2.4
- keras-applications==1.0.8
- keras-preprocessing==1.1.0
- kiwisolver==1.1.0
- mako==1.1.2
- markdown==3.1.1
- markupsafe==1.1.1
- matplotlib==3.1.0
- mistune==0.8.4
- nbconvert==5.5.0
- nbformat==4.4.0
- networkx==2.3
- notebook==5.7.8
- oauthlib==3.1.0
- opencv-python==4.1.0.25
- opt-einsum==3.2.1
- pandocfilters==1.4.2
- parso==0.5.0
- pexpect==4.7.0
- pickleshare==0.7.5
- pillow==6.0.0
- prometheus-client==0.7.1
- prompt-toolkit==2.0.9
- protobuf==3.8.0
- psutil==5.6.3
- ptyprocess==0.6.0
- pyasn1==0.4.8
- pyasn1-modules==0.2.8
- pycparser==2.19
- pycuda==2019.1.2
- pygments==2.4.2
- pyparsing==2.4.0
- pyqt5==5.13.0
- pyqt5-sip==4.19.18
- pyrsistent==0.14.11
- pyside2==5.13.0
- python-dateutil==2.8.0
- pytools==2020.1
- pytz==2019.1
- pywavelets==1.0.3
- pyyaml==5.1.1
- pyzmq==18.0.0
- qdarkgraystyle==1.0.2
- qdarkstyle==2.7
- qtconsole==4.5.1
- requests==2.22.0
- requests-oauthlib==1.3.0
- rsa==4.0
- scikit-image==0.15.0
- scikit-video==1.1.11
- scipy==1.2.1
- send2trash==1.5.0
- shiboken2==5.13.0
- six==1.12.0
- tensorboard==1.15.0
- tensorboard-plugin-wit==1.6.0.post3
- tensorflow-estimator==1.15.1
- tensorflow-gpu==1.15.0
- termcolor==1.1.0
- terminado==0.8.2
- testpath==0.4.2
- toolz==0.9.0
- torch==1.1.0
- torchdiffeq==0.0.1
- torchvision==0.3.0
- tornado==6.0.3
- tqdm==4.32.1
- traitlets==4.3.2
- urllib3==1.25.3
- wcwidth==0.1.7
- webencodings==0.5.1
- werkzeug==0.15.4
- widgetsnbextension==3.4.2
- wrapt==1.11.2
- zipp==0.5.2
Conda does not work well with large environments in which everything is pinned to specific versions (in contrast to other ecosystems, where pinning everything is the standard). The output of conda env export, which is probably what this is, also includes the build strings, which are almost always too specific (and often platform-specific) for the purpose of installing the right version of the software. That is great for reproducibility of scientific work (where the specific versions and builds of everything need to be known), but not great for installing software (where there is plenty of flexibility in which versions will work with any given package).
I'd start by removing the build pins (dropping everything after the second = in each line) so that only the versions are pinned. After that, I'd start removing version pins.
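A small sketch of that first step (assuming the file is named environment.yml; the pattern only touches conda-style name=version=build lines, so the pip: section with its == pins is left alone):
import re

# Rewrite environment.yml without the build strings, e.g.
# "- numpy=1.16.4=py37h7e9f1db_0" becomes "- numpy=1.16.4".
with open("environment.yml") as f:
    lines = f.readlines()

with open("environment-nobuilds.yml", "w") as out:
    for line in lines:
        out.write(re.sub(r"^(\s*- [^=\s]+=[^=\s]+)=\S+$", r"\1", line))
For future exports, conda env export --no-builds produces a file without build strings directly.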
Use Ubuntu 18.04 x86_64 (linux-64) and the environment.yml provided will work. It fails on macOS (Apple Silicon M1).
As has been pointed out in the other reply, exported environment files with explicit build numbers (and, as it turns out, fixed package version combinations) may not work if the host platform is different.
Exported environments are a great way to get reproducible environments, but the build platform must be the same. I suggest running conda info on the origin host and on the target host and comparing the platform field (e.g. linux-64 vs osx-arm64) to check whether they match.

Not able to install dependencies using conda

I'm trying to install dependencies for TensorFlow development. For that I'm using a yml file, tfdl_env.yml, with conda env create, which was supposed to install all the dependencies:
conda env create -f tfdl_env.yml
But it fails with Solving environment: failed and ResolvePackageNotFound.
The yml file used is below.
name: tfdeeplearning
channels:
- defaults
dependencies:
- bleach=1.5.0=py35_0
- certifi=2016.2.28=py35_0
- colorama=0.3.9=py35_0
- cycler=0.10.0=py35_0
- decorator=4.1.2=py35_0
- entrypoints=0.2.3=py35_0
- html5lib=0.9999999=py35_0
- icu=57.1=vc14_0
- ipykernel=4.6.1=py35_0
- ipython=6.1.0=py35_0
- ipython_genutils=0.2.0=py35_0
- ipywidgets=6.0.0=py35_0
- jedi=0.10.2=py35_2
- jinja2=2.9.6=py35_0
- jpeg=9b=vc14_0
- jsonschema=2.6.0=py35_0
- jupyter=1.0.0=py35_3
- jupyter_client=5.1.0=py35_0
- jupyter_console=5.2.0=py35_0
- jupyter_core=4.3.0=py35_0
- libpng=1.6.30=vc14_1
- markupsafe=1.0=py35_0
- matplotlib=2.0.2=np113py35_0
- mistune=0.7.4=py35_0
- mkl=2017.0.3=0
- nbconvert=5.2.1=py35_0
- nbformat=4.4.0=py35_0
- notebook=5.0.0=py35_0
- numpy=1.13.1=py35_0
- openssl=1.0.2l=vc14_0
- pandas=0.20.3=py35_0
- pandocfilters=1.4.2=py35_0
- path.py=10.3.1=py35_0
- pickleshare=0.7.4=py35_0
- pip=9.0.1=py35_1
- prompt_toolkit=1.0.15=py35_0
- pygments=2.2.0=py35_0
- pyparsing=2.2.0=py35_0
- pyqt=5.6.0=py35_2
- python=3.5.4=0
- python-dateutil=2.6.1=py35_0
- pytz=2017.2=py35_0
- pyzmq=16.0.2=py35_0
- qt=5.6.2=vc14_6
- qtconsole=4.3.1=py35_0
- requests=2.14.2=py35_0
- scikit-learn=0.19.0=np113py35_0
- scipy=0.19.1=np113py35_0
- setuptools=36.4.0=py35_1
- simplegeneric=0.8.1=py35_1
- sip=4.18=py35_0
- six=1.10.0=py35_1
- testpath=0.3.1=py35_0
- tk=8.5.18=vc14_0
- tornado=4.5.2=py35_0
- traitlets=4.3.2=py35_0
- vs2015_runtime=14.0.25420=0
- wcwidth=0.1.7=py35_0
- wheel=0.29.0=py35_0
- widgetsnbextension=3.0.2=py35_0
- win_unicode_console=0.5=py35_0
- wincertstore=0.2=py35_0
- zlib=1.2.11=vc14_0
- pip:
- ipython-genutils==0.2.0
- jupyter-client==5.1.0
- jupyter-console==5.2.0
- jupyter-core==4.3.0
- markdown==2.6.9
- prompt-toolkit==1.0.15
- protobuf==3.4.0
- tensorflow==1.3.0
- tensorflow-tensorboard==0.1.6
- werkzeug==0.12.2
- win-unicode-console==0.5
prefix: C:\Users\varun\Anaconda3\envs\tfdeeplearning
I'm using Anaconda 3 and conda version 4.7.12, on a Windows 10 machine. The purpose of this is to install TensorFlow along with all its dependencies.
Same error for me, also on Windows 10, with Anaconda 3 (2019.10) and Python 3.7 (all 64-bit). Here is my output:
Collecting package metadata (repodata.json): done
Solving environment: failed
ResolvePackageNotFound:
- notebook==5.0.0=py35_0
- python-dateutil==2.6.1=py35_0
- wcwidth==0.1.7=py35_0
- testpath==0.3.1=py35_0
- libpng==1.6.30=vc14_1
- nbformat==4.4.0=py35_0
- tornado==4.5.2=py35_0
- numpy==1.13.1=py35_0
- setuptools==36.4.0=py35_1
- zlib==1.2.11=vc14_0
- html5lib==0.9999999=py35_0
- wheel==0.29.0=py35_0
- ipython==6.1.0=py35_0
- simplegeneric==0.8.1=py35_1
- ipykernel==4.6.1=py35_0
- colorama==0.3.9=py35_0
- jpeg==9b=vc14_0
- certifi==2016.2.28=py35_0
- scikit-learn==0.19.0=np113py35_0
- pip==9.0.1=py35_1
- ipython_genutils==0.2.0=py35_0
- jedi==0.10.2=py35_2
- tk==8.5.18=vc14_0
- mkl==2017.0.3=0
- icu==57.1=vc14_0
- pandas==0.20.3=py35_0
- qtconsole==4.3.1=py35_0
- widgetsnbextension==3.0.2=py35_0
- pickleshare==0.7.4=py35_0
- jupyter_console==5.2.0=py35_0
- bleach==1.5.0=py35_0
- jupyter_client==5.1.0=py35_0
- ipywidgets==6.0.0=py35_0
- openssl==1.0.2l=vc14_0
- pandocfilters==1.4.2=py35_0
- qt==5.6.2=vc14_6
- win_unicode_console==0.5=py35_0
- pytz==2017.2=py35_0
- pyzmq==16.0.2=py35_0
- pyqt==5.6.0=py35_2
- decorator==4.1.2=py35_0
- path.py==10.3.1=py35_0
- jupyter==1.0.0=py35_3
- jsonschema==2.6.0=py35_0
- markupsafe==1.0=py35_0
- requests==2.14.2=py35_0
- jupyter_core==4.3.0=py35_0
- entrypoints==0.2.3=py35_0
- six==1.10.0=py35_1
- cycler==0.10.0=py35_0
- mistune==0.7.4=py35_0
- scipy==0.19.1=np113py35_0
- traitlets==4.3.2=py35_0
- vs2015_runtime==14.0.25420=0
- wincertstore==0.2=py35_0
- matplotlib==2.0.2=np113py35_0
- nbconvert==5.2.1=py35_0
- python==3.5.4=0
- jinja2==2.9.6=py35_0
- pygments==2.2.0=py35_0
- prompt_toolkit==1.0.15=py35_0
- pyparsing==2.2.0=py35_0
- sip==4.18=py35_0
After several unsuccessful attempts to install from the provided tfdl_env.yml file, I gave up and just proceeded to install the needed packages myself with conda install <PACKAGE>. I also found that some package versions specified in the provided file were not really up to date, and conda was unable to find those versions. I'm actually quite disappointed with this Anaconda environments system: it seems to be just an "environment clone tool" for the very machine of the user who created the environment, and the environments are not portable at all, as one would expect.
However, I have now made it work on my Windows 10 machine, so you can try it as well. Here is an environment.yml file I created myself from my installation, which is fully working as far as I can tell (I'm already following Section 5 of the course):
name: tfdeeplearning
channels:
- defaults
dependencies:
- _tflow_select=2.2.0=eigen
- absl-py=0.9.0=py37_0
- asn1crypto=1.3.0=py37_0
- astor=0.8.0=py37_0
- attrs=19.3.0=py_0
- backcall=0.1.0=py37_0
- blas=1.0=mkl
- bleach=3.1.0=py37_0
- blinker=1.4=py37_0
- ca-certificates=2020.1.1=0
- cachetools=3.1.1=py_0
- certifi=2019.11.28=py37_0
- cffi=1.14.0=py37h7a1dbc1_0
- chardet=3.0.4=py37_1003
- click=7.0=py37_0
- colorama=0.4.3=py_0
- cryptography=2.8=py37h7a1dbc1_0
- cycler=0.10.0=py37_0
- decorator=4.4.1=py_0
- defusedxml=0.6.0=py_0
- entrypoints=0.3=py37_0
- freetype=2.9.1=ha9979f8_1
- gast=0.2.2=py37_0
- google-auth=1.11.2=py_0
- google-auth-oauthlib=0.4.1=py_2
- google-pasta=0.1.8=py_0
- grpcio=1.27.2=py37h351948d_0
- h5py=2.10.0=py37h5e291fa_0
- hdf5=1.10.4=h7ebc959_0
- icc_rt=2019.0.0=h0cc432a_1
- icu=58.2=ha66f8fd_1
- idna=2.8=py37_0
- importlib_metadata=1.5.0=py37_0
- intel-openmp=2020.0=166
- ipykernel=5.1.4=py37h39e3cac_0
- ipython=7.12.0=py37h5ca1d4c_0
- ipython_genutils=0.2.0=py37_0
- ipywidgets=7.5.1=py_0
- jedi=0.16.0=py37_0
- jinja2=2.11.1=py_0
- joblib=0.14.1=py_0
- jpeg=9b=hb83a4c4_2
- jsonschema=3.2.0=py37_0
- jupyter=1.0.0=py37_7
- jupyter_client=5.3.4=py37_0
- jupyter_console=6.1.0=py_0
- jupyter_core=4.6.1=py37_0
- keras-applications=1.0.8=py_0
- keras-preprocessing=1.1.0=py_1
- kiwisolver=1.1.0=py37ha925a31_0
- libpng=1.6.37=h2a8f88b_0
- libprotobuf=3.11.4=h7bd577a_0
- libsodium=1.0.16=h9d3ae62_0
- m2w64-gcc-libgfortran=5.3.0=6
- m2w64-gcc-libs=5.3.0=7
- m2w64-gcc-libs-core=5.3.0=7
- m2w64-gmp=6.1.0=2
- m2w64-libwinpthread-git=5.0.0.4634.697f757=2
- markdown=3.1.1=py37_0
- markupsafe=1.1.1=py37he774522_0
- matplotlib=3.1.3=py37_0
- matplotlib-base=3.1.3=py37h64f37c6_0
- mistune=0.8.4=py37he774522_0
- mkl=2020.0=166
- mkl-service=2.3.0=py37hb782905_0
- mkl_fft=1.0.15=py37h14836fe_0
- mkl_random=1.1.0=py37h675688f_0
- msys2-conda-epoch=20160418=1
- nbconvert=5.6.1=py37_0
- nbformat=5.0.4=py_0
- notebook=6.0.3=py37_0
- numpy=1.18.1=py37h93ca92e_0
- numpy-base=1.18.1=py37hc3f5095_1
- oauthlib=3.1.0=py_0
- openssl=1.1.1d=he774522_4
- opt_einsum=3.1.0=py_0
- pandas=1.0.1=py37h47e9c7a_0
- pandoc=2.2.3.2=0
- pandocfilters=1.4.2=py37_1
- parso=0.6.1=py_0
- pip=20.0.2=py37_1
- prometheus_client=0.7.1=py_0
- prompt_toolkit=3.0.3=py_0
- protobuf=3.11.4=py37h33f27b4_0
- pyasn1=0.4.8=py_0
- pyasn1-modules=0.2.7=py_0
- pycparser=2.19=py37_0
- pygments=2.5.2=py_0
- pyjwt=1.7.1=py37_0
- pyopenssl=19.1.0=py37_0
- pyparsing=2.4.6=py_0
- pyqt=5.9.2=py37h6538335_2
- pyreadline=2.1=py37_1
- pyrsistent=0.15.7=py37he774522_0
- pysocks=1.7.1=py37_0
- python=3.7.6=h60c2a47_2
- python-dateutil=2.8.1=py_0
- pytz=2019.3=py_0
- pywinpty=0.5.7=py37_0
- pyzmq=18.1.1=py37ha925a31_0
- qt=5.9.7=vc14h73c81de_0
- qtconsole=4.6.0=py_1
- requests=2.22.0=py37_1
- requests-oauthlib=1.3.0=py_0
- rsa=4.0=py_0
- scikit-learn=0.22.1=py37h6288b17_0
- scipy=1.4.1=py37h9439919_0
- send2trash=1.5.0=py37_0
- setuptools=45.2.0=py37_0
- sip=4.19.8=py37h6538335_0
- six=1.14.0=py37_0
- sqlite=3.31.1=he774522_0
- tensorboard=2.1.0=py3_0
- tensorflow=1.15.0=eigen_py37h9f89a44_0
- tensorflow-base=1.15.0=eigen_py37h07d2309_0
- tensorflow-estimator=1.15.1=pyh2649769_0
- termcolor=1.1.0=py37_1
- terminado=0.8.3=py37_0
- testpath=0.4.4=py_0
- tornado=6.0.3=py37he774522_3
- traitlets=4.3.3=py37_0
- urllib3=1.25.8=py37_0
- vc=14.1=h0510ff6_4
- vs2015_runtime=14.16.27012=hf0eaf9b_1
- wcwidth=0.1.8=py_0
- webencodings=0.5.1=py37_1
- werkzeug=0.16.1=py_0
- wheel=0.34.2=py37_0
- widgetsnbextension=3.5.1=py37_0
- win_inet_pton=1.1.0=py37_0
- wincertstore=0.2=py37_0
- winpty=0.4.3=4
- wrapt=1.11.2=py37he774522_0
- zeromq=4.3.1=h33f27b4_3
- zipp=2.2.0=py_0
- zlib=1.2.11=h62dcd97_3
- pip:
- ipython-genutils==0.2.0
- jupyter-client==6.0.0
- jupyter-core==4.6.3
- pickleshare==0.7.5
- pywin32==227
prefix: C:\Users\jose_\Anaconda3\envs\tfdeeplearning
Just copy the content to an environment.yml file on your box and run conda env create -f environment.yml.
Also, check the last line, prefix, where you'll have to modify the path to match yours (probably just substitute jose_). As I said before, this conda environments tool doesn't seem suitable for creating portable environments to be distributed to different machines.
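For sharing environments across machines, a less brittle export is also worth trying, assuming a reasonably recent conda: conda env export --no-builds keeps the version pins but drops the build strings, and conda env export --from-history records only the packages that were explicitly requested, letting the solver pick platform-appropriate builds on the target machine.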
Good luck.

Deploying a Dash app with a GDAL dependency fails on Heroku

I'm using conda for package management and including an environment.yml file and a requirements.txt file for deployment, with help from this post. I've gotten simple Dash apps to deploy this way, but for a more complex task that requires GDAL, the build 'succeeds' and then the app crashes with the following log:
Starting process with command `gunicorn app:server --log-file=-
heroku[web.1]: Process exited with status 127
app[web.1]: bash: gunicorn: command not found
app[api]: Build succeeded
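(Exit status 127 is the shell's "command not found", which matches the bash message: the build succeeded, but gunicorn is not on the PATH of the environment the web dyno actually runs in.)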
My environment.yml file calls for:
name: dashpilot
channels:
- conda-forge
- anaconda-fusion
- defaults
dependencies:
- asn1crypto=0.24.0=py36_1003
- attrs=18.2.0=py_0
- bzip2=1.0.6=1
- ca-certificates=2018.10.15=ha4d7672_0
- certifi=2018.10.15=py36_1000
- cffi=1.11.5=py36h5e8e0c9_1
- chardet=3.0.4=py36_1003
- click=7.0=py_0
- cryptography-vectors=2.3.1=py36_1000
- dash=0.30.0=py_0
- dash-core-components=0.38.0=py_0
- dash-html-components=0.13.2=py_0
- dash-renderer=0.15.0=py_0
- decorator=4.3.0=py_0
- flask=1.0.2=py_2
- flask-compress=1.4.0=py_0
- geojson=2.4.1=py_0
- idna=2.7=py36_1002
- ipython_genutils=0.2.0=py_1
- itsdangerous=1.1.0=py_0
- jinja2=2.10=py_1
- jsonschema=3.0.0a3=py36_1000
- jupyter_core=4.4.0=py_0
- markupsafe=1.1.0=py36h470a237_0
- nbformat=4.4.0=py_1
- openssl=1.0.2p=h470a237_1
- plotly=3.4.1=py_0
- pycparser=2.19=py_0
- pyopenssl=18.0.0=py36_1000
- pyrsistent=0.14.7=py36h470a237_0
- pysocks=1.6.8=py36_1002
- pytz=2018.7=py_0
- requests=2.20.1=py36_1000
- retrying=1.3.3=py_2
- six=1.11.0=py36_1001
- traitlets=4.3.2=py36_1000
- urllib3=1.23=py36_1001
- werkzeug=0.14.1=py_0
- blas=1.0=mkl
- cairo=1.14.12=hc4e6be7_4
- click-plugins=1.0.4=py36_0
- cligj=0.5.0=py36_0
- cryptography=2.3.1=py36hdbc3d79_0
- curl=7.61.1=ha441bb4_0
- cycler=0.10.0=py36hfc81398_0
- descartes=1.1.0=py36_0
- expat=2.2.6=h0a44026_0
- fiona=1.7.12=py36h0dff353_0
- fontconfig=2.13.0=h5d5b041_1
- freetype=2.9.1=hb4e5f40_0
- freexl=1.0.5=h1de35cc_0
- gdal=2.2.4=py36h6440ff4_1
- geopandas=0.3.0=py36_0
- geos=3.6.2=h5470d99_2
- gettext=0.19.8.1=h15daf44_3
- giflib=5.1.4=h1de35cc_1
- glib=2.56.2=hd9629dc_0
- gunicorn=19.9.0=py36_0
- hdf4=4.2.13=h39711bb_2
- hdf5=1.10.2=hfa1e0ec_1
- icu=58.2=h4b95b61_1
- intel-openmp=2019.1=144
- jpeg=9b=he5867d9_2
- json-c=0.13.1=h3efe00b_0
- kealib=1.4.7=h40e48e4_6
- kiwisolver=1.0.1=py36h0a44026_0
- krb5=1.16.1=h24a3359_6
- libboost=1.67.0=hebc422b_4
- libcurl=7.61.1=hf30b1f0_0
- libcxx=4.0.1=hcfea43d_1
- libcxxabi=4.0.1=hcfea43d_1
- libdap4=3.19.1=h3d3e54a_0
- libedit=3.1.20170329=hb402a30_2
- libffi=3.2.1=h475c297_4
- libgdal=2.2.4=h7b1ea53_1
- libgfortran=3.0.1=h93005f0_2
- libiconv=1.15=hdd342a3_7
- libkml=1.3.0=hbe12b63_4
- libnetcdf=4.6.1=h4e6abe9_2
- libpng=1.6.35=ha441bb4_0
- libpq=10.5=hf30b1f0_0
- libspatialindex=1.8.5=h2c08c6b_2
- libspatialite=4.3.0a=ha12ebda_19
- libssh2=1.8.0=h322a93b_4
- libtiff=4.0.9=hcb84e12_2
- libuuid=1.0.3=h6bb4b03_2
- libxml2=2.9.8=hab757c2_1
- matplotlib=3.0.1=py36h54f8f79_0
- mkl=2018.0.3=1
- mkl_fft=1.0.6=py36hb8a8100_0
- mkl_random=1.0.1=py36h5d10147_1
- munch=2.3.2=py36_0
- ncurses=6.1=h0a44026_0
- numpy=1.15.4=py36h6a91979_0
- numpy-base=1.15.4=py36h8a80b8c_0
- openjpeg=2.3.0=hb95cd4c_1
- pandas=0.23.4=py36h6440ff4_0
- pcre=8.42=h378b8a2_0
- pip=18.1=py36_0
- pixman=0.34.0=hca0a616_3
- poppler=0.65.0=ha097c24_1
- poppler-data=0.4.9=0
- proj4=5.0.1=h1de35cc_0
- psycopg2=2.7.5=py36hdbc3d79_0
- pyparsing=2.3.0=py36_0
- pyproj=1.9.5.1=py36h833a5d7_1
- pysal=1.14.4.post1=py36_1
- python=3.6.6=hc167b69_0
- python-dateutil=2.7.5=py36_0
- readline=7.0=h1de35cc_5
- rtree=0.8.3=py36_0
- scipy=1.1.0=py36h28f7352_1
- setuptools=40.6.2=py36_0
- shapely=1.6.4=py36h20de77a_0
- sqlalchemy=1.2.14=py36h1de35cc_0
- sqlite=3.25.3=ha441bb4_0
- tk=8.6.8=ha441bb4_0
- tornado=5.1.1=py36h1de35cc_0
- wheel=0.32.3=py36_0
- xerces-c=3.2.2=h44e365a_0
- xz=5.2.4=h1de35cc_4
- zlib=1.2.11=h1de35cc_3
- pip:
- dash-table==3.1.6
prefix: /Applications/anaconda3/envs/dashpilot
My requirements.txt file calls for:
asn1crypto==0.24.0
attrs==18.2.0
certifi==2018.10.15
cffi==1.11.5
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cryptography==2.3.1
cryptography-vectors==2.3.1
cycler==0.10.0
dash==0.30.0
dash-core-components==0.38.0
dash-html-components==0.13.2
dash-renderer==0.15.0
dash-table==3.1.6
decorator==4.3.0
descartes==1.1.0
Fiona==1.7.12
Flask==1.0.2
Flask-Compress==1.4.0
GDAL==2.2.4
geojson==2.4.1
geopandas==0.3.0
gunicorn==19.7.1
idna==2.7
ipython-genutils==0.2.0
itsdangerous==1.1.0
Jinja2==2.10
jsonschema==3.0.0a3
jupyter-core==4.4.0
kiwisolver==1.0.1
MarkupSafe==1.1.0
matplotlib==3.0.1
mkl-fft==1.0.6
mkl-random==1.0.1
munch==2.3.2
nbformat==4.4.0
numpy==1.15.4
pandas==0.23.4
plotly==3.4.1
psycopg2==2.7.5
pycparser==2.19
pyOpenSSL==18.0.0
pyparsing==2.3.0
pyproj==1.9.5.1
pyrsistent==0.14.7
PySAL==1.14.4.post1
PySocks==1.6.8
python-dateutil==2.7.5
pytz==2018.7
requests==2.20.1
retrying==1.3.3
Rtree==0.8.3
scipy==1.1.0
Shapely==1.6.4.post1
six==1.11.0
SQLAlchemy==1.2.14
tornado==5.1.1
traitlets==4.3.2
urllib3==1.23
Werkzeug==0.14.1
I've tried all sorts of combinations of buildpacks (Heroku's Python buildpack and the GDAL buildpack recommended in Heroku's docs). The app works when running heroku local on my Mac when solely using the recommended GDAL buildpack. I'm new to the deployment process, so I'm unclear where I might be going astray.
I will add that my config vars are currently these:
BUILD_WITH_GEO_LIBRARIES=1
GDAL_LIBRARY_PATH=os.environ.get('GDAL_LIBRARY_PATH')
GEOS_LIBRARY_PATH=os.environ.get('GEOS_LIBRARY_PATH')
WEB_CONCURRENCY=3
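(Note that Heroku config vars are literal strings: the values os.environ.get('GDAL_LIBRARY_PATH') and os.environ.get('GEOS_LIBRARY_PATH') above are stored verbatim, not evaluated as Python, so as set those two vars do not point at any actual library paths.)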

Where's my time going mysteriously?

I have a Ruby script that uses the rsolr rubygem to generate XML and POST it to Apache Solr (Update Command), served by a Jetty server. My script logs timings using the following code:
unless docs.empty?
  begin
    log.info("Adding to solr")
    response = solr.add(docs)
    log.info("#{(id_2*100.0)/last_id}% Done")
    if response['responseHeader']['status'] != 0
      log.fatal("Document ids not sent")
      #log.fatal(Solr::Request::AddDocument.new(docs_single).to_s)
      log.close
      exit
    end
    log.info("#{Time.now.to_f - starttime}s to feed Solr. #{id_1} to #{id_2}")
  rescue Exception => e
    log.fatal("Document ids not sent => ")
    #log.fatal(Solr::Request::AddDocument.new(docs_single).to_s)
    #log.fatal(docs)
    log.close
    exit
  end
end
The generated log looks like this:
I, [2011-10-09T15:03:42.617048 #30092] INFO -- : Executing - SELECT * FROM solr_feeddata_2 WHERE id >= 5879999 AND id < 5881999
I, [2011-10-09T15:03:44.086661 #30092] INFO -- : External Data1 fetch time: 1.45462989807129
I, [2011-10-09T15:03:44.109514 #30092] INFO -- : External Data2 fetch time: 0.0226790904998779
I, [2011-10-09T15:03:44.109611 #30092] INFO -- : 1.49255704879761s to fetch details from database. 5879999 to 5881999
I, [2011-10-09T15:03:44.109702 #30092] INFO -- : Adding data1, data2, building docs
I, [2011-10-09T15:03:45.912603 #30092] INFO -- : 3.29554414749146s to build documents. 5879999 to 5881999
I, [2011-10-09T15:03:45.912730 #30092] INFO -- : Adding to solr
I, [2011-10-09T15:04:24.797620 #30092] INFO -- : 61.180855194502% Done
I, [2011-10-09T15:04:24.797744 #30092] INFO -- : 42.180694103241s to feed Solr. 5879999 to 5881999
According to this log, Solr took 42.18 - 3.29 - 1.49 - 2 = 35.4s to respond.
At the same time, my Solr log for this particular update looks like this:
INFO: {add=[5879999, 5880000, 5880001, 5880002, 5880003, 5880004, 5880005, 5880007, ... (1468 adds)]} 0 5780
Oct 9, 2011 3:04:24 PM org.apache.solr.core.SolrCore execute
INFO: [core0] webapp=/solr path=/update params={wt=ruby} status=0 QTime=5780
Oct 9, 2011 3:04:42 PM org.apache.solr.update.processor.LogUpdateProcessor finish
This clearly shows that Solr took 5.78s to add the docs, initiate the response send, and close the log updater.
Both services run on the same machine, on the same network, and their ping summary is
rtt min/avg/max/mdev = 0.008/0.010/0.022/0.006 ms
This pattern is clearly visible for every batch of data processed. Despite my sincere efforts to unravel this mystery, I am not able to find the reason for this behavior.
My Solr mergeFactor is 10, autoCommit is off.
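One detail worth bearing in mind when comparing the two logs: QTime is measured inside Solr and covers only the server-side handling of the update. Everything the Ruby client does inside solr.add, in particular serializing the 1468 documents to XML before anything reaches Solr and then waiting for and reading the response, lies outside that measurement, so the missing ~35s would be accumulating on the client or transport side rather than in Solr's own update handling.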
