I have a problem that I believe is caused by my HTTPS setup. I came to this conclusion simply because the problem does not happen when I install over HTTP, so it is most likely due to some specific configuration missing from my docker-compose (or something along those lines).
Below are my docker-compose.yml file, the error, and the stack trace that the screen itself shows.
version: '3'
services:
  # MongoDB: https://hub.docker.com/_/mongo/
  mongo:
    image: mongo:3
    networks:
      - graylog
  # Elasticsearch: https://www.elastic.co/guide/en/elasticsearch/reference/6.x/docker.html
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch-oss:6.8.2
    volumes:
      - es_data:/usr/share/elasticsearch/data
    environment:
      - http.host=0.0.0.0
      - transport.host=localhost
      - network.host=0.0.0.0
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    deploy:
      resources:
        limits:
          memory: 1g
    networks:
      - graylog
  # Graylog: https://hub.docker.com/r/graylog/graylog/
  graylog:
    image: csilveir/graylog
    volumes:
      - /home/ubuntu/graylog:/home/ubuntu/graylog
      - /home/ubuntu/graylog/plugins/graylog-plugin-slack-notification-1.0.4.jar:/usr/share/graylog/plugin/graylog-plugin-slack-notification-1.0.4.jar
    environment:
      # (must be at least 16 characters)!
      - GRAYLOG_ROOT_TIMEZONE=America/Sao_Paulo
      - GRAYLOG_ROOT_EMAIL=dev#dragonvc.com.br
      - GRAYLOG_IS_MASTER=true
      # HTTPS
      - GRAYLOG_HTTP_ENABLE_TLS=true
      - GRAYLOG_HTTP_TLS_CERT_FILE=/home/ubuntu/graylog/graylog.crt
      - GRAYLOG_HTTP_TLS_KEY_FILE=/home/ubuntu/graylog/graylog.key
      - GRAYLOG_HTTP_PUBLISH_URI=https://graylog.dragonvc.com.br/
    networks:
      - graylog
    depends_on:
      - mongo
      - elasticsearch
    ports:
      #- "80:9000"
      - 80:443
      - 443:9000
      - 514:514
      - 514:514/udp
      - 1514:1514/udp
      - 5044:5044
      - 9000:9000
      - 9350:9350
      - 12200-12300:12200-12300
      - 12200-12300:12200-12300/udp
      - 12900:12900
networks:
  graylog:
    driver: bridge
# Volumes for persisting data, see https://docs.docker.com/engine/admin/volumes/volumes/
volumes:
  mongo_data:
    driver: local
  es_data:
    driver: local
  graylog_journal:
    driver: local
Cannot set property '__data__' of undefined
Stack Trace:
TypeError: Cannot set property '__data__' of undefined
at Array.ye.select (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:83:227338)
at Array.Z.insert (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:83:224227)
at Array.ye.insert (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:83:227450)
at SVGGElement.<anonymous> (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:83:350536)
at https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:83:226023
at me (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:83:222388)
at Array.Z.each (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:83:225997)
at Array.l (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:83:350305)
at Array.Z.call (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:83:226096)
at r._drawAxis (https://graylog.dragonvc.com.br/assets/vendors~LoggedInPage.af2f821c666e2573f8ad.js:46:42162)
at render (https://graylog.dragonvc.com.br/assets/vendors~LoggedInPage.af2f821c666e2573f8ad.js:46:41677)
at https://graylog.dragonvc.com.br/assets/vendors~LoggedInPage.af2f821c666e2573f8ad.js:46:40953
at https://graylog.dragonvc.com.br/assets/vendors~LoggedInPage.af2f821c666e2573f8ad.js:46:23606
at Array.forEach (<anonymous>)
at e.Graph.render (https://graylog.dragonvc.com.br/assets/vendors~LoggedInPage.af2f821c666e2573f8ad.js:46:23586)
at Object.drawResultGraph (https://graylog.dragonvc.com.br/assets/LoggedInPage.af2f821c666e2573f8ad.js:1:203535)
at t._renderHistogram (https://graylog.dragonvc.com.br/assets/LoggedInPage.af2f821c666e2573f8ad.js:1:218539)
at t.componentDidMount (https://graylog.dragonvc.com.br/assets/LoggedInPage.af2f821c666e2573f8ad.js:1:217792)
at t.componentDidMount (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:73:88989)
at Ro (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:18:82395)
at Xo (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:18:85070)
at https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:18:98277
at Object.exports.unstable_runWithPriority (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:35:3284)
at Os (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:18:98212)
at Ys (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:18:97988)
at Ss (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:18:97333)
at Ls (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:18:96354)
at Zo (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:18:95228)
at Object.enqueueSetState (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:18:44755)
at t.b.setState (https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js:26:1665)
at https://graylog.dragonvc.com.br/assets/90afab18-75.af2f821c666e2573f8ad.js:1:2875
at l (https://graylog.dragonvc.com.br/assets/builtins.af2f821c666e2573f8ad.js:104:88608)
at O._settlePromiseFromHandler (https://graylog.dragonvc.com.br/assets/builtins.af2f821c666e2573f8ad.js:104:61890)
at O._settlePromise (https://graylog.dragonvc.com.br/assets/builtins.af2f821c666e2573f8ad.js:104:62690)
at O._settlePromise0 (https://graylog.dragonvc.com.br/assets/builtins.af2f821c666e2573f8ad.js:104:63389)
at O._settlePromises (https://graylog.dragonvc.com.br/assets/builtins.af2f821c666e2573f8ad.js:104:64716)
at https://graylog.dragonvc.com.br/assets/builtins.af2f821c666e2573f8ad.js:104:18338
Component Stack:
in LegacyHistogram
in div
in t
in div
in t
in t
in t
in SearchPage
in Unknown
in n
in div
in t
in div
in t
in div
in AppWithSearchBar
in div
in t
in t
in withRouter(t)
in div
in App
in RouterContext
in Router
in h
in t
in n
in AppFacade
This error occurs in a .js file, as shown in the screenshot.
I had this problem and resolved it by clearing the browser cache.
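If you want to confirm it really is a stale cached bundle rather than a server-side problem, a hard reload (Ctrl+Shift+R) bypasses the cache, or you can fetch the asset directly and compare it with what the browser loaded. The URL below is simply the one from the stack trace above:
curl -sI https://graylog.dragonvc.com.br/assets/vendor.91c91d4a31d54d96392a.js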
Related
I'm running into an issue where my conda env isn't being created from a .yaml file. I think it's failing because the yaml file was made on CentOS 6, but I have since upgraded to a new CentOS 7 AMI on AWS.
conda env create --name html_ribo_report --file=html_ribo_report.yaml
No errors, seems to complete after finding conflicts (takes about an hour to run). I can paste the output if needed.
Here are the contents of html_ribo_report.yaml:
name: eclipsebio
channels:
- https://repo.continuum.io/pkgs/free
- defaults
- conda-forge
- bioconda
dependencies:
- _libgcc_mutex=0.1=main
- _r-mutex=1.0.0=anacondar_1
- aioeasywebdav=2.2.0=py36_0
- aiohttp=2.2.5=py36h3a1b670_0
- altair=3.0.1=py36_0
- appdirs=1.4.3=py36_0
- argcomplete=1.8.2=py36_0
- argh=0.26.2=py36_1
- asn1crypto=0.22.0=py36h265ca7c_1
- async-timeout=2.0.0=py36h92a791d_0
- backports=1.0=py36hfa02d7e_1
- backports.functools_lru_cache=1.4=py36_1
- bamtools=2.4.0=3
- bcftools=1.6=0
- bedtools=2.26.0=0
- bioawk=1.0=1
- bioconductor-annotate=1.56.0=r3.4.1_0
- bioconductor-annotationdbi=1.40.0=r3.4.1_0
- bioconductor-biobase=2.38.0=r3.4.1_0
- bioconductor-biocgenerics=0.24.0=r3.4.1_0
- bioconductor-biocinstaller=1.28.0=r3.4.1_0
- bioconductor-biocparallel=1.12.0=r3.4.1_0
- bioconductor-biocstyle=2.6.0=r3.4.1_0
- bioconductor-biomart=2.34.0=r3.4.1_0
- bioconductor-biostrings=2.46.0=r3.4.1_0
- bioconductor-bsgenome=1.46.0=r3.4.1_0
- bioconductor-bumphunter=1.20.0=r3.4.1_0
- bioconductor-delayedarray=0.4.1=r3.4.1_0
- bioconductor-derfinder=1.12.0=r3.4.1_0
- bioconductor-derfinderhelper=1.12.0=r3.4.1_0
- bioconductor-edger=3.20.1=r3.4.1_0
- bioconductor-genefilter=1.60.0=r3.4.1_0
- bioconductor-geneplotter=1.56.0=r3.4.1_0
- bioconductor-genomeinfodb=1.14.0=r3.4.1_0
- bioconductor-genomeinfodbdata=1.1.0=r341_0
- bioconductor-genomicalignments=1.14.0=r3.4.1_0
- bioconductor-genomicfeatures=1.26.4=r3.4.1_0
- bioconductor-genomicfiles=1.14.0=r3.4.1_0
- bioconductor-genomicranges=1.30.0=r3.4.1_0
- bioconductor-glimma=1.6.0=r3.4.1_0
- bioconductor-iranges=2.12.0=r3.4.1_0
- bioconductor-limma=3.34.1=r3.4.1_0
- bioconductor-qvalue=2.10.0=r3.4.1_0
- bioconductor-rsamtools=1.30.0=r3.4.1_0
- bioconductor-rtracklayer=1.38.0=r3.4.1_0
- bioconductor-s4vectors=0.16.0=r3.4.1_0
- bioconductor-summarizedexperiment=1.8.0=r3.4.1_0
- bioconductor-tximport=1.6.0=r3.4.1_0
- bioconductor-variantannotation=1.20.3=r3.4.1_0
- bioconductor-xvector=0.18.0=r3.4.1_0
- bioconductor-zlibbioc=1.24.0=r3.4.1_0
- biopython=1.69=np113py36_0
- blas=1.0=mkl
- bleach=2.1.1=py36hd521086_0
- boto=2.48.0=py36h6e4cd66_1
- boto3=1.4.7=py36h4cc92d5_0
- botocore=1.7.20=py36h085fff1_0
- bowtie=1.2.3=py36hc9558a2_0
- bowtie2=2.3.5=py36he860b03_0
- bzip2=1.0.6=h0376d23_1
- ca-certificates=2017.08.26=h1d4fec5_0
- cairo=1.14.8=0
- certifi=2016.9.26=py36_0
- cffi=1.10.0=py36had8d393_1
- chardet=3.0.4=py36h0f667ec_1
- click=6.7=py36h5253387_0
- configargparse=0.12.0=py36_0
- cookies=2.2.1=py36_0
- cryptography=2.0.3=py36ha225213_1
- curl=7.52.1=0
- cutadapt=2.7=py36h516909a_0
- cycler=0.10.0=py36h93f1223_0
- cython=0.28.5=py36hf484d3e_0
- dbus=1.10.20=0
- decorator=4.1.2=py36hd076ac8_0
- dicttoxml=1.7.4=py36_0
- dnaio=0.4.1=py36h516909a_0
- docopt=0.6.2=py36_0
- docutils=0.14=py36hb0f60f5_0
- dropbox=7.3.1=py36_0
- entrypoints=0.2.3=py36h1aec115_2
- expat=2.2.4=h6ea4f2b_2
- fastqc=0.11.6=2
- filechunkio=1.8=py36_1
- flask=0.12.2=py36hb24657c_0
- font-ttf-dejavu-sans-mono=2.37=h6964260_0
- font-ttf-inconsolata=2.001=hcb22688_0
- font-ttf-source-code-pro=2.030=h7457263_0
- fontconfig=2.12.1=3
- freetype=2.5.5=2
- ftputil=3.3.1=py36_0
- future=0.16.0=py36_1
- gettext=0.19.8.1=hd7bead4_3
- gffutils=0.9=py36_0
- glib=2.50.2=1
- gmp=6.1.2=h6c8ec71_1
- graphite2=1.3.10=hc526e54_0
- gsl=2.2.1=h0c605f7_3
- gst-plugins-base=1.8.0=0
- gstreamer=1.8.0=0
- harfbuzz=0.9.39=2
- homer=4.10=pl526hc9558a2_0
- html5lib=0.999999999=py36h2cfc398_0
- htseq=0.9.1=py36_0
- htslib=1.6=0
- icu=54.1=0
- idna=2.6=py36h82fb2a8_1
- intel-openmp=2018.0.0=h15fc484_7
- ipykernel=4.6.1=py36hbf841aa_0
- ipython=6.2.1=py36h88c514a_1
- ipython_genutils=0.2.0=py36hb52b0d5_0
- itsdangerous=0.24=py36h93cc618_1
- jbig=2.1=hdba287a_0
- jedi=0.11.0=py36hf290c5b_0
- jinja2=2.9.6=py36h489bce4_1
- jmespath=0.9.3=py36hd3948f9_0
- jpeg=9b=h024ee3a_2
- jsonschema=2.6.0=py36h006f8b5_0
- jupyter_client=5.1.0=py36h614e9ea_0
- jupyter_core=4.4.0=py36h7c827e3_0
- krb5=1.14.2=hcdc1b81_6
- libedit=3.1.20170329=0
- libffi=3.2.1=hd88cf55_4
- libgcc=7.2.0=h69d50b8_2
- libgcc-ng=8.2.0=hdf63c60_1
- libgfortran-ng=7.2.0=h9f7466a_2
- libiconv=1.14=0
- libpng=1.6.36=hbc83047_0
- libsodium=1.0.15=hf101ebd_0
- libssh2=1.8.0=h2d05a93_3
- libstdcxx-ng=9.1.0=hdf63c60_0
- libtiff=4.0.9=h28f6b97_0
- libuuid=2.32.1=h14c3975_1000
- libxcb=1.12=h84ff03f_3
- libxml2=2.9.4=0
- markupsafe=1.0=py36hd9260cd_1
- matplotlib=2.0.2=np113py36_0
- matplotlib-venn=0.11.5=py_1
- mistune=0.8.1=py36h3d5977c_0
- mkl=2018.0.0=hb491cac_4
- mock=2.0.0=py36h3c5bf6c_0
- moto=1.1.1=py36_0
- multidict=3.2.0=py36h97a4c74_0
- mysql=5.5.24=0
- mysql-connector-c=6.1.6=0
- nbconvert=5.3.1=py36hb41ffb7_0
- nbformat=4.4.0=py36h31c9010_0
- ncurses=5.9=10
- notebook=5.2.2=py36h40a37e6_0
- numpy=1.13.3=py36ha12f23b_0
- openjdk=8.0.121=1
- openssl=1.0.2r=h7b6447c_0
- pandas=0.21.0=py36h78bd809_1
- pandoc=1.19.2.1=hea2e7c5_1
- pandocfilters=1.4.2=py36ha6701b7_1
- pango=1.40.3=1
- paramiko=2.1.2=py36_0
- parso=0.1.0=py36h2b61f4d_0
- patsy=0.4.1=py36ha3be15e_0
- pbr=3.1.1=py36hb5f6b33_0
- pcre=8.39=1
- perl=5.26.2=h14c3975_0
- perl-common-sense=3.74=pl526_2
- perl-json=4.02=pl526_0
- perl-json-xs=2.34=pl526h6bb024c_3
- perl-types-serialiser=1.0=pl526_2
- pexpect=4.3.0=py36h673ed17_0
- pickleshare=0.7.4=py36h63277f8_0
- pigz=2.3=0
- pip=9.0.1=py36h6c6f9ce_4
- pixman=0.34.0=hceecf20_3
- prompt_toolkit=1.0.15=py36h17d85b1_0
- psutil=5.4.0=py36h84c53db_0
- ptyprocess=0.5.2=py36h69acd42_0
- pyaml=17.10.0=py_0
- pyasn1=0.3.7=py36h0f28794_0
- pybedtools=0.7.10=py36_1
- pycparser=2.18=py36hf9f622e_1
- pyfaidx=0.5.0=py36_0
- pygments=2.2.0=py36h0d3125c_0
- pyopenssl=17.2.0=py36h5cc804b_0
- pyparsing=2.2.0=py36hee85983_1
- pyqt=5.6.0=py36_2
- pysam=0.11.2.2=py36_1
- pysftp=0.2.9=py36_0
- pysocks=1.6.7=py36hd97a5b1_1
- python=3.6.3=h1284df2_4
- python-dateutil=2.6.1=py36h88d3b88_1
- pytz=2017.2=py36hc2ccc2a_1
- pyyaml=3.12=py36hafb9ca4_1
- pyzmq=16.0.3=py36he2533c7_0
- qt=5.6.2=5
- qualimap=2.2.2a=1
- r-acepack=1.4.1=r3.4.1_0
- r-assertthat=0.2.0=r3.4.1_0
- r-backports=1.1.0=r3.4.1_0
- r-base=3.4.1=0
- r-base64enc=0.1_3=r3.4.1_0
- r-bh=1.62.0_1=r3.4.1_0
- r-bitops=1.0_6=r3.4.1_2
- r-bookdown=0.4=r3.4.1_0
- r-catools=1.17.1=r3.4.1_2
- r-checkmate=1.8.2=r3.4.1_0
- r-cluster=2.0.6=r3.4.1_0
- r-codetools=0.2_15=r3.4.1_0
- r-colorspace=1.3_2=r3.4.1_0
- r-data.table=1.10.4=r3.4.1_0
- r-dbi=0.6_1=r3.4.1_0
- r-dichromat=2.0_0=r3.4.1_2
- r-digest=0.6.12=r3.4.1_0
- r-dorng=1.6.6=r3.4.1_0
- r-evaluate=0.10=r3.4.1_0
- r-foreach=1.4.3=r3.4.1_0
- r-foreign=0.8_68=r3.4.1_0
- r-formula=1.2_1=r3.4.1_0
- r-futile.logger=1.4.3=r3.4.1_0
- r-futile.options=1.0.0=r3.4.1_0
- r-gdata=2.18.0=r3.4.1_0
- r-getopt=1.20.0=r3.4.1_0
- r-ggplot2=2.2.1=r3.4.1_0
- r-gplots=3.0.1=r3.4.1_0
- r-gridextra=2.2.1=r3.4.1_0
- r-gtable=0.2.0=r3.4.1_0
- r-gtools=3.5.0=r3.4.1_0
- r-highr=0.6=r3.4.1_0
- r-hmisc=4.0_3=r3.4.1_0
- r-htmltable=1.9=r3.4.1_0
- r-htmltools=0.3.6=r3.4.1_0
- r-htmlwidgets=0.8=r3.4.1_1
- r-iterators=1.0.8=r3.4.1_0
- r-jsonlite=1.5=r3.4.1_0
- r-kernsmooth=2.23_15=r3.4.1_0
- r-knitr=1.16=r3.4.1_0
- r-knitrbootstrap=1.0.1=r3.4.1_0
- r-labeling=0.3=r3.4.1_2
- r-lambda.r=1.1.9=r3.4.1_0
- r-lattice=0.20_35=r3.4.1_0
- r-latticeextra=0.6_28=r3.4.1_0
- r-lazyeval=0.2.0=r3.4.1_0
- r-locfit=1.5_9.1=r3.4.1_0
- r-magrittr=1.5=r3.4.1_2
- r-markdown=0.8=r3.4.1_0
- r-mass=7.3_47=r3.4.1_0
- r-matrix=1.2_10=r3.4.1_0
- r-matrixstats=0.52.2=r3.4.1_0
- r-memoise=1.1.0=r3.4.1_0
- r-mime=0.5=r3.4.1_0
- r-munsell=0.4.3=r3.4.1_0
- r-nnet=7.3_12=r3.4.1_0
- r-pkgmaker=0.22=r3.4.1_0
- r-plogr=0.1_1=r3.4.1_0
- r-plyr=1.8.4=r3.4.1_0
- r-prettyunits=1.0.2=r3.4.1_0
- r-progress=1.1.2=r3.4.1_0
- r-r6=2.2.1=r3.4.1_0
- r-rcolorbrewer=1.1_2=r3.4.1_3
- r-rcpp=0.12.11=r3.4.1_0
- r-rcpparmadillo=0.7.900.2.0=r3.4.1_0
- r-rcurl=1.95_4.8=r3.4.1_0
- r-registry=0.3=r3.4.1_0
- r-reshape2=1.4.2=r3.4.1_0
- r-rjson=0.2.15=r3.4.1_0
- r-rlang=0.1.1=r3.4.1_0
- r-rmarkdown=1.5=r3.4.1_0
- r-rngtools=1.2.4=r3.4.1_0
- r-rpart=4.1_11=r3.4.1_0
- r-rprojroot=1.2=r3.4.1_0
- r-rsqlite=1.1_2=r3.4.1_0
- r-scales=0.4.1=r3.4.1_0
- r-snow=0.4_2=r3.4.1_0
- r-stringi=1.1.5=r3.4.1_0
- r-stringr=1.2.0=r3.4.1_0
- r-survival=2.41_3=r3.4.1_0
- r-tibble=1.3.3=r3.4.1_0
- r-viridis=0.4.0=r3.4.1_0
- r-viridislite=0.2.0=r3.4.1_0
- r-xml=3.98_1.7=r3.4.1_0
- r-xtable=1.8_2=r3.4.1_0
- r-yaml=2.1.14=r3.4.1_0
- ratelimiter=1.2.0=py36_0
- readline=7.0=hb321a52_4
- regex=2017.09.23=py36h2527052_0
- requests=2.14.2=py36_0
- s3transfer=0.1.10=py36h0257dcc_1
- samtools=1.5=1
- scikit-learn=0.19.1=py36h7aa7ec6_0
- scipy=1.0.0=py36hbf646e7_0
- seaborn=0.8.0=py36h197244f_0
- seqtk=1.3=hed695b0_2
- setuptools=36.5.0=py36he42e2e1_0
- setuptools-git=1.2=py36_0
- simplegeneric=0.8.1=py36h2cb9092_0
- simplejson=3.11.1=py36_0
- sip=4.18.1=py36h51ed4ed_2
- six=1.11.0=py36h372c433_1
- snakemake=4.3.0=py36_0
- sqlite=3.20.1=hb898158_2
- star=2.6.0c=1
- statsmodels=0.8.0=py36h8533d0b_0
- system=5.8=2
- tbb=2019.8=hc9558a2_0
- terminado=0.6=py36ha25a19f_0
- testpath=0.3.1=py36h8cadb63_0
- tk=8.6.7=hc745277_3
- toolz=0.9.0=py36_0
- tornado=4.5.2=py36h1283b2a_0
- traitlets=4.3.2=py36h674d592_0
- ucsc-bedclip=377=h199ee4e_0
- ucsc-bedgraphtobigwig=332=0
- ucsc-bedtobigbed=332=0
- ucsc-twobittofa=357=1
- umi_tools=0.5.1=py36_0
- urllib3=1.22=py36hbe7ace6_0
- wcwidth=0.1.7=py36hdf4376a_0
- webencodings=0.5.1=py36h800622e_1
- werkzeug=0.12.2=py36hc703753_0
- wget=1.19.1=he4ec0ba_0
- wheel=0.29.0=py36he7f4e38_1
- wrapt=1.10.11=py36h28b7045_0
- xmltodict=0.11.0=py36_0
- xopen=0.8.4=py36_0
- xz=5.2.3=h55aa19d_2
- yaml=0.1.7=h014fa73_2
- yarl=0.13.0=py36h21e4b6b_0
- zeromq=4.2.2=hbedb6e5_2
- zlib=1.2.11=ha838bed_2
- pip:
- pillow==8.3.2
prefix: /home/ec2-user/anaconda3/envs/eclipsebio
At a glance there could be many issues with this environment. Some of the most obvious things are:
You may need channel_priority set to flexible, i.e.,
conda config --set channel_priority flexible
Otherwise, higher priority channels might be masking packages in the other channels.
The channel order is substandard. The free channel should always be last, and Bioconda has very specific channel order requirements to ensure their packages work. Change the order to:
channels:
- conda-forge
- bioconda
- defaults
- free
Switch to Mamba.
conda install -n base conda-forge::mamba
mamba env create -n html_ribo_report -f html_ribo_report.yaml
Otherwise, I'd consider breaking the environment up into more modular (i.e., manageable) components that fit different tasks. Personally, I avoid mixing Python and R in the same environment, and I keep Snakemake in its own dedicated environment.
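As a rough sketch of what such a split could look like (the file names and package subsets below are purely illustrative, not a tested spec), each one created with conda env create -f <file>:
# ribo_r.yaml - R / Bioconductor pieces
name: ribo_r
channels:
  - conda-forge
  - bioconda
  - defaults
dependencies:
  - r-base=3.4
  - bioconductor-edger
  - bioconductor-limma
  - r-ggplot2
# ribo_py.yaml - Python / command-line pieces
name: ribo_py
channels:
  - conda-forge
  - bioconda
  - defaults
dependencies:
  - python=3.6
  - cutadapt
  - pysam
  - samtools
# snakemake.yaml - workflow engine in its own environment
name: snakemake
channels:
  - conda-forge
  - bioconda
  - defaults
dependencies:
  - snakemake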
When I type conda env create -f environment.yml
I constantly get
Collecting package metadata (repodata.json): done
Solving environment: failed
ResolvePackageNotFound:
- tk==8.6.8=hbc83047_0
- zlib==1.2.11=h7b6447c_3
- av==8.0.2=py37h06622b3_4
- lame==3.100=h7f98852_1001
- xz==5.2.4=h14c3975_4
- mkl_random==1.0.2=py37hd81dba3_0
- x264==1!152.20180806=h14c3975_0
- numpy-base==1.16.4=py37hde5b4d6_0
- certifi==2020.12.5=py37h06a4308_0
- _openmp_mutex==4.5=1_llvm
- llvm-openmp==11.0.0=hfc4b9b4_1
- freetype==2.9.1=h8a8886c_1
- scikit-learn==0.22.1=py37hd81dba3_0
- libgfortran-ng==7.3.0=hdf63c60_0
- readline==7.0=h7b6447c_5
- mkl_fft==1.0.12=py37ha843d7b_0
- libpng==1.6.37=hbc83047_0
- libedit==3.1.20181209=hc058e9b_0
- libffi==3.2.1=hd88cf55_4
- nettle==3.6=he412f7d_0
- gnutls==3.6.13=h85f3911_1
- python==3.7.3=h0371630_0
- gmp==6.2.1=h58526e2_0
- _libgcc_mutex==0.1=conda_forge
- libgcc-ng==9.3.0=h5dbcf3e_17
- mkl-service==2.3.0=py37he904b0f_0
- ffmpeg==4.3.1=h3215721_1
- openh264==2.1.1=h8b12597_0
- mkl==2019.4=243
- numpy==1.16.4=py37h7e9f1db_0
- ca-certificates==2020.12.8=h06a4308_0
- libiconv==1.16=h516909a_0
- intel-openmp==2019.4=243
- libstdcxx-ng==9.1.0=hdf63c60_0
- zstd==1.3.7=h0b5b093_0
- ncurses==6.1=he6710b0_1
- jpeg==9b=h024ee3a_2
- openssl==1.1.1i=h27cfd23_0
- bzip2==1.0.8=h7f98852_4
- sqlite==3.28.0=h7b6447c_0
- libtiff==4.0.10=h2733197_2
What should I do?
My yml file is:
name: StyleFlow
channels:
- anaconda
- defaults
- conda-forge
dependencies:
- _libgcc_mutex=0.1=conda_forge
- _openmp_mutex=4.5=1_llvm
- av=8.0.2=py37h06622b3_4
- blas=1.0=mkl
- bzip2=1.0.8=h7f98852_4
- ca-certificates=2020.12.8=h06a4308_0
- certifi=2020.12.5=py37h06a4308_0
- ffmpeg=4.3.1=h3215721_1
- freetype=2.9.1=h8a8886c_1
- gmp=6.2.1=h58526e2_0
- gnutls=3.6.13=h85f3911_1
- intel-openmp=2019.4=243
- joblib=0.14.1=py_0
- jpeg=9b=h024ee3a_2
- lame=3.100=h7f98852_1001
- libedit=3.1.20181209=hc058e9b_0
- libffi=3.2.1=hd88cf55_4
- libgcc-ng=9.3.0=h5dbcf3e_17
- libgfortran-ng=7.3.0=hdf63c60_0
- libiconv=1.16=h516909a_0
- libpng=1.6.37=hbc83047_0
- libstdcxx-ng=9.1.0=hdf63c60_0
- libtiff=4.0.10=h2733197_2
- llvm-openmp=11.0.0=hfc4b9b4_1
- mkl=2019.4=243
- mkl-service=2.3.0=py37he904b0f_0
- mkl_fft=1.0.12=py37ha843d7b_0
- mkl_random=1.0.2=py37hd81dba3_0
- natsort=6.0.0=py_0
- ncurses=6.1=he6710b0_1
- nettle=3.6=he412f7d_0
- numpy=1.16.4=py37h7e9f1db_0
- numpy-base=1.16.4=py37hde5b4d6_0
- olefile=0.46=py37_0
- openh264=2.1.1=h8b12597_0
- openssl=1.1.1i=h27cfd23_0
- pip=19.1.1=py37_0
- python=3.7.3=h0371630_0
- python_abi=3.7=1_cp37m
- readline=7.0=h7b6447c_5
- scikit-learn=0.22.1=py37hd81dba3_0
- setuptools=41.0.1=py37_0
- sqlite=3.28.0=h7b6447c_0
- tk=8.6.8=hbc83047_0
- wheel=0.33.4=py37_0
- x264=1!152.20180806=h14c3975_0
- xz=5.2.4=h14c3975_4
- zlib=1.2.11=h7b6447c_3
- zstd=1.3.7=h0b5b093_0
- pip:
- absl-py==0.7.1
- appdirs==1.4.4
- astor==0.8.0
- astunparse==1.6.3
- attrs==19.1.0
- backcall==0.1.0
- bleach==3.1.0
- cachetools==4.1.0
- cffi==1.12.3
- chardet==3.0.4
- cloudpickle==1.2.1
- cycler==0.10.0
- cytoolz==0.9.0.1
- dask==2.1.0
- decorator==4.4.0
- defusedxml==0.6.0
- deprecated==1.2.6
- dill==0.2.9
- dlib==19.21.0
- dominate==2.3.5
- easydict==1.9
- entrypoints==0.3
- gast==0.2.2
- google-auth==1.14.3
- google-auth-oauthlib==0.4.1
- google-pasta==0.2.0
- grpcio==1.22.0
- h5py==2.10.0
- helpdev==0.6.10
- idna==2.8
- imageio==2.5.0
- importlib-metadata==0.18
- imutils==0.5.3
- ipykernel==5.1.1
- ipython==7.6.0
- ipython-genutils==0.2.0
- ipywidgets==7.4.2
- jedi==0.13.3
- jinja2==2.10.1
- jsonschema==3.0.1
- jupyter==1.0.0
- jupyter-client==5.2.4
- jupyter-console==6.0.0
- jupyter-core==4.5.0
- keras==2.2.4
- keras-applications==1.0.8
- keras-preprocessing==1.1.0
- kiwisolver==1.1.0
- mako==1.1.2
- markdown==3.1.1
- markupsafe==1.1.1
- matplotlib==3.1.0
- mistune==0.8.4
- nbconvert==5.5.0
- nbformat==4.4.0
- networkx==2.3
- notebook==5.7.8
- oauthlib==3.1.0
- opencv-python==4.1.0.25
- opt-einsum==3.2.1
- pandocfilters==1.4.2
- parso==0.5.0
- pexpect==4.7.0
- pickleshare==0.7.5
- pillow==6.0.0
- prometheus-client==0.7.1
- prompt-toolkit==2.0.9
- protobuf==3.8.0
- psutil==5.6.3
- ptyprocess==0.6.0
- pyasn1==0.4.8
- pyasn1-modules==0.2.8
- pycparser==2.19
- pycuda==2019.1.2
- pygments==2.4.2
- pyparsing==2.4.0
- pyqt5==5.13.0
- pyqt5-sip==4.19.18
- pyrsistent==0.14.11
- pyside2==5.13.0
- python-dateutil==2.8.0
- pytools==2020.1
- pytz==2019.1
- pywavelets==1.0.3
- pyyaml==5.1.1
- pyzmq==18.0.0
- qdarkgraystyle==1.0.2
- qdarkstyle==2.7
- qtconsole==4.5.1
- requests==2.22.0
- requests-oauthlib==1.3.0
- rsa==4.0
- scikit-image==0.15.0
- scikit-video==1.1.11
- scipy==1.2.1
- send2trash==1.5.0
- shiboken2==5.13.0
- six==1.12.0
- tensorboard==1.15.0
- tensorboard-plugin-wit==1.6.0.post3
- tensorflow-estimator==1.15.1
- tensorflow-gpu==1.15.0
- termcolor==1.1.0
- terminado==0.8.2
- testpath==0.4.2
- toolz==0.9.0
- torch==1.1.0
- torchdiffeq==0.0.1
- torchvision==0.3.0
- tornado==6.0.3
- tqdm==4.32.1
- traitlets==4.3.2
- urllib3==1.25.3
- wcwidth==0.1.7
- webencodings==0.5.1
- werkzeug==0.15.4
- widgetsnbextension==3.4.2
- wrapt==1.11.2
- zipp==0.5.2
Conda does not work well with large environments in which everything is pinned to specific versions (in contrast to other ecosystems where pinning everything is the standard). The output of conda env export, which is probably what this is, also includes the build numbers, which are almost always too specific (and often platform-specific) for the purpose of installing the right version of the software. That is great for reproducibility of scientific work (where the exact versions and builds of everything need to be known), but not great for installing software (where there is plenty of flexibility in which versions will work with any given package).
I'd start by removing the build pins (dropping everything after the second = in each line) so that only the versions are pinned. After that, I'd start removing version pins.
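For example, a quick way to strip the builds in bulk (a sketch only; the regex assumes the name=version=build layout that conda env export produces and leaves the pip section's == pins alone, so check the result by eye):
sed -E 's/(=[^=]+)=[^=]+$/\1/' environment.yml > environment-nobuilds.yml
conda env create -f environment-nobuilds.yml
For future exports, conda env export --no-builds writes a file without build strings in the first place.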
Use Ubuntu 18.04 x86 (linux-64) and the environment.yml provided will work. It fails on macOS (Apple M1 silicon).
As has been pointed out in the other reply, exported environment files with explicit build numbers (and, as it turns out, fixed package version combinations) may not work if the host platform is different.
Exported environments are a great way to get reproducible environments, but the build platform must be the same. I suggest running conda info on the origin host and on the target host to check that they are the same.
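For example (the output lines are only an illustration of the fields to compare on each machine):
conda info | grep -iE 'platform|conda version'
#        platform : linux-64
#   conda version : 4.9.2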
I'm trying to install the dependencies for TensorFlow development. For that I'm using a yml file, tfdl_env.yml, with conda env create, which is supposed to install all the dependencies:
conda env create -f tfdl_env.yml
But it fails with Solving environment: failed and ResolvePackageNotFound.
The yml file used is below.
name: tfdeeplearning
channels:
- defaults
dependencies:
- bleach=1.5.0=py35_0
- certifi=2016.2.28=py35_0
- colorama=0.3.9=py35_0
- cycler=0.10.0=py35_0
- decorator=4.1.2=py35_0
- entrypoints=0.2.3=py35_0
- html5lib=0.9999999=py35_0
- icu=57.1=vc14_0
- ipykernel=4.6.1=py35_0
- ipython=6.1.0=py35_0
- ipython_genutils=0.2.0=py35_0
- ipywidgets=6.0.0=py35_0
- jedi=0.10.2=py35_2
- jinja2=2.9.6=py35_0
- jpeg=9b=vc14_0
- jsonschema=2.6.0=py35_0
- jupyter=1.0.0=py35_3
- jupyter_client=5.1.0=py35_0
- jupyter_console=5.2.0=py35_0
- jupyter_core=4.3.0=py35_0
- libpng=1.6.30=vc14_1
- markupsafe=1.0=py35_0
- matplotlib=2.0.2=np113py35_0
- mistune=0.7.4=py35_0
- mkl=2017.0.3=0
- nbconvert=5.2.1=py35_0
- nbformat=4.4.0=py35_0
- notebook=5.0.0=py35_0
- numpy=1.13.1=py35_0
- openssl=1.0.2l=vc14_0
- pandas=0.20.3=py35_0
- pandocfilters=1.4.2=py35_0
- path.py=10.3.1=py35_0
- pickleshare=0.7.4=py35_0
- pip=9.0.1=py35_1
- prompt_toolkit=1.0.15=py35_0
- pygments=2.2.0=py35_0
- pyparsing=2.2.0=py35_0
- pyqt=5.6.0=py35_2
- python=3.5.4=0
- python-dateutil=2.6.1=py35_0
- pytz=2017.2=py35_0
- pyzmq=16.0.2=py35_0
- qt=5.6.2=vc14_6
- qtconsole=4.3.1=py35_0
- requests=2.14.2=py35_0
- scikit-learn=0.19.0=np113py35_0
- scipy=0.19.1=np113py35_0
- setuptools=36.4.0=py35_1
- simplegeneric=0.8.1=py35_1
- sip=4.18=py35_0
- six=1.10.0=py35_1
- testpath=0.3.1=py35_0
- tk=8.5.18=vc14_0
- tornado=4.5.2=py35_0
- traitlets=4.3.2=py35_0
- vs2015_runtime=14.0.25420=0
- wcwidth=0.1.7=py35_0
- wheel=0.29.0=py35_0
- widgetsnbextension=3.0.2=py35_0
- win_unicode_console=0.5=py35_0
- wincertstore=0.2=py35_0
- zlib=1.2.11=vc14_0
- pip:
- ipython-genutils==0.2.0
- jupyter-client==5.1.0
- jupyter-console==5.2.0
- jupyter-core==4.3.0
- markdown==2.6.9
- prompt-toolkit==1.0.15
- protobuf==3.4.0
- tensorflow==1.3.0
- tensorflow-tensorboard==0.1.6
- werkzeug==0.12.2
- win-unicode-console==0.5
prefix: C:\Users\varun\Anaconda3\envs\tfdeeplearning
I'm using Anaconda 3 and conda version 4.7.12 on a Windows 10 machine. The purpose of this is to install TensorFlow along with all its dependencies.
Same error for me, also on Windows 10, with Anaconda 3 (2019.10) and Python 3.7 (all 64-bit). Here is my output:
Collecting package metadata (repodata.json): done
Solving environment: failed
ResolvePackageNotFound:
- notebook==5.0.0=py35_0
- python-dateutil==2.6.1=py35_0
- wcwidth==0.1.7=py35_0
- testpath==0.3.1=py35_0
- libpng==1.6.30=vc14_1
- nbformat==4.4.0=py35_0
- tornado==4.5.2=py35_0
- numpy==1.13.1=py35_0
- setuptools==36.4.0=py35_1
- zlib==1.2.11=vc14_0
- html5lib==0.9999999=py35_0
- wheel==0.29.0=py35_0
- ipython==6.1.0=py35_0
- simplegeneric==0.8.1=py35_1
- ipykernel==4.6.1=py35_0
- colorama==0.3.9=py35_0
- jpeg==9b=vc14_0
- certifi==2016.2.28=py35_0
- scikit-learn==0.19.0=np113py35_0
- pip==9.0.1=py35_1
- ipython_genutils==0.2.0=py35_0
- jedi==0.10.2=py35_2
- tk==8.5.18=vc14_0
- mkl==2017.0.3=0
- icu==57.1=vc14_0
- pandas==0.20.3=py35_0
- qtconsole==4.3.1=py35_0
- widgetsnbextension==3.0.2=py35_0
- pickleshare==0.7.4=py35_0
- jupyter_console==5.2.0=py35_0
- bleach==1.5.0=py35_0
- jupyter_client==5.1.0=py35_0
- ipywidgets==6.0.0=py35_0
- openssl==1.0.2l=vc14_0
- pandocfilters==1.4.2=py35_0
- qt==5.6.2=vc14_6
- win_unicode_console==0.5=py35_0
- pytz==2017.2=py35_0
- pyzmq==16.0.2=py35_0
- pyqt==5.6.0=py35_2
- decorator==4.1.2=py35_0
- path.py==10.3.1=py35_0
- jupyter==1.0.0=py35_3
- jsonschema==2.6.0=py35_0
- markupsafe==1.0=py35_0
- requests==2.14.2=py35_0
- jupyter_core==4.3.0=py35_0
- entrypoints==0.2.3=py35_0
- six==1.10.0=py35_1
- cycler==0.10.0=py35_0
- mistune==0.7.4=py35_0
- scipy==0.19.1=np113py35_0
- traitlets==4.3.2=py35_0
- vs2015_runtime==14.0.25420=0
- wincertstore==0.2=py35_0
- matplotlib==2.0.2=np113py35_0
- nbconvert==5.2.1=py35_0
- python==3.5.4=0
- jinja2==2.9.6=py35_0
- pygments==2.2.0=py35_0
- prompt_toolkit==1.0.15=py35_0
- pyparsing==2.2.0=py35_0
- sip==4.18=py35_0
After several unsuccessful attempts to install from the provided tfdl_env.yml file, I gave up and just installed the needed packages myself with conda install <PACKAGE>. I also found that some package versions specified in the provided file were no longer current and conda was unable to find them. I'm actually quite disappointed with this Anaconda environments system: it seems to be just an "environment clone tool" for the very machine of the user who created the environment, and the files are not portable at all, as one would expect.
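For reference, the manual route looked roughly like this (a sketch only; the exact package list and versions are whatever the course calls for, and I have not verified that every pin still resolves):
conda create -n tfdeeplearning python=3.5
conda activate tfdeeplearning
conda install numpy pandas matplotlib scikit-learn jupyter
pip install tensorflow==1.3.0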
That said, I did eventually get it working on my Windows 10 machine, so you can try it as well. Here is an environment.yml file I created from my installation, which is fully working as far as I can tell (I'm already following Section 5 of the course):
name: tfdeeplearning
channels:
- defaults
dependencies:
- _tflow_select=2.2.0=eigen
- absl-py=0.9.0=py37_0
- asn1crypto=1.3.0=py37_0
- astor=0.8.0=py37_0
- attrs=19.3.0=py_0
- backcall=0.1.0=py37_0
- blas=1.0=mkl
- bleach=3.1.0=py37_0
- blinker=1.4=py37_0
- ca-certificates=2020.1.1=0
- cachetools=3.1.1=py_0
- certifi=2019.11.28=py37_0
- cffi=1.14.0=py37h7a1dbc1_0
- chardet=3.0.4=py37_1003
- click=7.0=py37_0
- colorama=0.4.3=py_0
- cryptography=2.8=py37h7a1dbc1_0
- cycler=0.10.0=py37_0
- decorator=4.4.1=py_0
- defusedxml=0.6.0=py_0
- entrypoints=0.3=py37_0
- freetype=2.9.1=ha9979f8_1
- gast=0.2.2=py37_0
- google-auth=1.11.2=py_0
- google-auth-oauthlib=0.4.1=py_2
- google-pasta=0.1.8=py_0
- grpcio=1.27.2=py37h351948d_0
- h5py=2.10.0=py37h5e291fa_0
- hdf5=1.10.4=h7ebc959_0
- icc_rt=2019.0.0=h0cc432a_1
- icu=58.2=ha66f8fd_1
- idna=2.8=py37_0
- importlib_metadata=1.5.0=py37_0
- intel-openmp=2020.0=166
- ipykernel=5.1.4=py37h39e3cac_0
- ipython=7.12.0=py37h5ca1d4c_0
- ipython_genutils=0.2.0=py37_0
- ipywidgets=7.5.1=py_0
- jedi=0.16.0=py37_0
- jinja2=2.11.1=py_0
- joblib=0.14.1=py_0
- jpeg=9b=hb83a4c4_2
- jsonschema=3.2.0=py37_0
- jupyter=1.0.0=py37_7
- jupyter_client=5.3.4=py37_0
- jupyter_console=6.1.0=py_0
- jupyter_core=4.6.1=py37_0
- keras-applications=1.0.8=py_0
- keras-preprocessing=1.1.0=py_1
- kiwisolver=1.1.0=py37ha925a31_0
- libpng=1.6.37=h2a8f88b_0
- libprotobuf=3.11.4=h7bd577a_0
- libsodium=1.0.16=h9d3ae62_0
- m2w64-gcc-libgfortran=5.3.0=6
- m2w64-gcc-libs=5.3.0=7
- m2w64-gcc-libs-core=5.3.0=7
- m2w64-gmp=6.1.0=2
- m2w64-libwinpthread-git=5.0.0.4634.697f757=2
- markdown=3.1.1=py37_0
- markupsafe=1.1.1=py37he774522_0
- matplotlib=3.1.3=py37_0
- matplotlib-base=3.1.3=py37h64f37c6_0
- mistune=0.8.4=py37he774522_0
- mkl=2020.0=166
- mkl-service=2.3.0=py37hb782905_0
- mkl_fft=1.0.15=py37h14836fe_0
- mkl_random=1.1.0=py37h675688f_0
- msys2-conda-epoch=20160418=1
- nbconvert=5.6.1=py37_0
- nbformat=5.0.4=py_0
- notebook=6.0.3=py37_0
- numpy=1.18.1=py37h93ca92e_0
- numpy-base=1.18.1=py37hc3f5095_1
- oauthlib=3.1.0=py_0
- openssl=1.1.1d=he774522_4
- opt_einsum=3.1.0=py_0
- pandas=1.0.1=py37h47e9c7a_0
- pandoc=2.2.3.2=0
- pandocfilters=1.4.2=py37_1
- parso=0.6.1=py_0
- pip=20.0.2=py37_1
- prometheus_client=0.7.1=py_0
- prompt_toolkit=3.0.3=py_0
- protobuf=3.11.4=py37h33f27b4_0
- pyasn1=0.4.8=py_0
- pyasn1-modules=0.2.7=py_0
- pycparser=2.19=py37_0
- pygments=2.5.2=py_0
- pyjwt=1.7.1=py37_0
- pyopenssl=19.1.0=py37_0
- pyparsing=2.4.6=py_0
- pyqt=5.9.2=py37h6538335_2
- pyreadline=2.1=py37_1
- pyrsistent=0.15.7=py37he774522_0
- pysocks=1.7.1=py37_0
- python=3.7.6=h60c2a47_2
- python-dateutil=2.8.1=py_0
- pytz=2019.3=py_0
- pywinpty=0.5.7=py37_0
- pyzmq=18.1.1=py37ha925a31_0
- qt=5.9.7=vc14h73c81de_0
- qtconsole=4.6.0=py_1
- requests=2.22.0=py37_1
- requests-oauthlib=1.3.0=py_0
- rsa=4.0=py_0
- scikit-learn=0.22.1=py37h6288b17_0
- scipy=1.4.1=py37h9439919_0
- send2trash=1.5.0=py37_0
- setuptools=45.2.0=py37_0
- sip=4.19.8=py37h6538335_0
- six=1.14.0=py37_0
- sqlite=3.31.1=he774522_0
- tensorboard=2.1.0=py3_0
- tensorflow=1.15.0=eigen_py37h9f89a44_0
- tensorflow-base=1.15.0=eigen_py37h07d2309_0
- tensorflow-estimator=1.15.1=pyh2649769_0
- termcolor=1.1.0=py37_1
- terminado=0.8.3=py37_0
- testpath=0.4.4=py_0
- tornado=6.0.3=py37he774522_3
- traitlets=4.3.3=py37_0
- urllib3=1.25.8=py37_0
- vc=14.1=h0510ff6_4
- vs2015_runtime=14.16.27012=hf0eaf9b_1
- wcwidth=0.1.8=py_0
- webencodings=0.5.1=py37_1
- werkzeug=0.16.1=py_0
- wheel=0.34.2=py37_0
- widgetsnbextension=3.5.1=py37_0
- win_inet_pton=1.1.0=py37_0
- wincertstore=0.2=py37_0
- winpty=0.4.3=4
- wrapt=1.11.2=py37he774522_0
- zeromq=4.3.1=h33f27b4_3
- zipp=2.2.0=py_0
- zlib=1.2.11=h62dcd97_3
- pip:
- ipython-genutils==0.2.0
- jupyter-client==6.0.0
- jupyter-core==4.6.3
- pickleshare==0.7.5
- pywin32==227
prefix: C:\Users\jose_\Anaconda3\envs\tfdeeplearning
Just copy the content into an environment.yml file on your box and run conda env create -f environment.yml.
Also, check the last line, prefix, where you'll have to modify the path to match yours (probably just substitute jose_). As I said before, this Conda environments tool doesn't seem suitable for creating portable environments to be distributed to different machines.
Good luck.
I'm using conda for package management and including an environment.yml file and a requirements.txt file for deployment, with help from this post. I've gotten simple Dash apps to deploy this way, but for a more complex task that requires GDAL, the build "succeeds" but the app crashes with the following log:
Starting process with command `gunicorn app:server --log-file=-`
heroku[web.1]: Process exited with status 127
app[web.1]: bash: gunicorn: command not found
app[api]: Build succeeded
My environment.yml file calls for:
name: dashpilot
channels:
- conda-forge
- anaconda-fusion
- defaults
dependencies:
- asn1crypto=0.24.0=py36_1003
- attrs=18.2.0=py_0
- bzip2=1.0.6=1
- ca-certificates=2018.10.15=ha4d7672_0
- certifi=2018.10.15=py36_1000
- cffi=1.11.5=py36h5e8e0c9_1
- chardet=3.0.4=py36_1003
- click=7.0=py_0
- cryptography-vectors=2.3.1=py36_1000
- dash=0.30.0=py_0
- dash-core-components=0.38.0=py_0
- dash-html-components=0.13.2=py_0
- dash-renderer=0.15.0=py_0
- decorator=4.3.0=py_0
- flask=1.0.2=py_2
- flask-compress=1.4.0=py_0
- geojson=2.4.1=py_0
- idna=2.7=py36_1002
- ipython_genutils=0.2.0=py_1
- itsdangerous=1.1.0=py_0
- jinja2=2.10=py_1
- jsonschema=3.0.0a3=py36_1000
- jupyter_core=4.4.0=py_0
- markupsafe=1.1.0=py36h470a237_0
- nbformat=4.4.0=py_1
- openssl=1.0.2p=h470a237_1
- plotly=3.4.1=py_0
- pycparser=2.19=py_0
- pyopenssl=18.0.0=py36_1000
- pyrsistent=0.14.7=py36h470a237_0
- pysocks=1.6.8=py36_1002
- pytz=2018.7=py_0
- requests=2.20.1=py36_1000
- retrying=1.3.3=py_2
- six=1.11.0=py36_1001
- traitlets=4.3.2=py36_1000
- urllib3=1.23=py36_1001
- werkzeug=0.14.1=py_0
- blas=1.0=mkl
- cairo=1.14.12=hc4e6be7_4
- click-plugins=1.0.4=py36_0
- cligj=0.5.0=py36_0
- cryptography=2.3.1=py36hdbc3d79_0
- curl=7.61.1=ha441bb4_0
- cycler=0.10.0=py36hfc81398_0
- descartes=1.1.0=py36_0
- expat=2.2.6=h0a44026_0
- fiona=1.7.12=py36h0dff353_0
- fontconfig=2.13.0=h5d5b041_1
- freetype=2.9.1=hb4e5f40_0
- freexl=1.0.5=h1de35cc_0
- gdal=2.2.4=py36h6440ff4_1
- geopandas=0.3.0=py36_0
- geos=3.6.2=h5470d99_2
- gettext=0.19.8.1=h15daf44_3
- giflib=5.1.4=h1de35cc_1
- glib=2.56.2=hd9629dc_0
- gunicorn=19.9.0=py36_0
- hdf4=4.2.13=h39711bb_2
- hdf5=1.10.2=hfa1e0ec_1
- icu=58.2=h4b95b61_1
- intel-openmp=2019.1=144
- jpeg=9b=he5867d9_2
- json-c=0.13.1=h3efe00b_0
- kealib=1.4.7=h40e48e4_6
- kiwisolver=1.0.1=py36h0a44026_0
- krb5=1.16.1=h24a3359_6
- libboost=1.67.0=hebc422b_4
- libcurl=7.61.1=hf30b1f0_0
- libcxx=4.0.1=hcfea43d_1
- libcxxabi=4.0.1=hcfea43d_1
- libdap4=3.19.1=h3d3e54a_0
- libedit=3.1.20170329=hb402a30_2
- libffi=3.2.1=h475c297_4
- libgdal=2.2.4=h7b1ea53_1
- libgfortran=3.0.1=h93005f0_2
- libiconv=1.15=hdd342a3_7
- libkml=1.3.0=hbe12b63_4
- libnetcdf=4.6.1=h4e6abe9_2
- libpng=1.6.35=ha441bb4_0
- libpq=10.5=hf30b1f0_0
- libspatialindex=1.8.5=h2c08c6b_2
- libspatialite=4.3.0a=ha12ebda_19
- libssh2=1.8.0=h322a93b_4
- libtiff=4.0.9=hcb84e12_2
- libuuid=1.0.3=h6bb4b03_2
- libxml2=2.9.8=hab757c2_1
- matplotlib=3.0.1=py36h54f8f79_0
- mkl=2018.0.3=1
- mkl_fft=1.0.6=py36hb8a8100_0
- mkl_random=1.0.1=py36h5d10147_1
- munch=2.3.2=py36_0
- ncurses=6.1=h0a44026_0
- numpy=1.15.4=py36h6a91979_0
- numpy-base=1.15.4=py36h8a80b8c_0
- openjpeg=2.3.0=hb95cd4c_1
- pandas=0.23.4=py36h6440ff4_0
- pcre=8.42=h378b8a2_0
- pip=18.1=py36_0
- pixman=0.34.0=hca0a616_3
- poppler=0.65.0=ha097c24_1
- poppler-data=0.4.9=0
- proj4=5.0.1=h1de35cc_0
- psycopg2=2.7.5=py36hdbc3d79_0
- pyparsing=2.3.0=py36_0
- pyproj=1.9.5.1=py36h833a5d7_1
- pysal=1.14.4.post1=py36_1
- python=3.6.6=hc167b69_0
- python-dateutil=2.7.5=py36_0
- readline=7.0=h1de35cc_5
- rtree=0.8.3=py36_0
- scipy=1.1.0=py36h28f7352_1
- setuptools=40.6.2=py36_0
- shapely=1.6.4=py36h20de77a_0
- sqlalchemy=1.2.14=py36h1de35cc_0
- sqlite=3.25.3=ha441bb4_0
- tk=8.6.8=ha441bb4_0
- tornado=5.1.1=py36h1de35cc_0
- wheel=0.32.3=py36_0
- xerces-c=3.2.2=h44e365a_0
- xz=5.2.4=h1de35cc_4
- zlib=1.2.11=h1de35cc_3
- pip:
- dash-table==3.1.6
prefix: /Applications/anaconda3/envs/dashpilot
My requirements.txt file calls for:
asn1crypto==0.24.0
attrs==18.2.0
certifi==2018.10.15
cffi==1.11.5
chardet==3.0.4
Click==7.0
click-plugins==1.0.4
cligj==0.5.0
cryptography==2.3.1
cryptography-vectors==2.3.1
cycler==0.10.0
dash==0.30.0
dash-core-components==0.38.0
dash-html-components==0.13.2
dash-renderer==0.15.0
dash-table==3.1.6
decorator==4.3.0
descartes==1.1.0
Fiona==1.7.12
Flask==1.0.2
Flask-Compress==1.4.0
GDAL==2.2.4
geojson==2.4.1
geopandas==0.3.0
gunicorn==19.7.1
idna==2.7
ipython-genutils==0.2.0
itsdangerous==1.1.0
Jinja2==2.10
jsonschema==3.0.0a3
jupyter-core==4.4.0
kiwisolver==1.0.1
MarkupSafe==1.1.0
matplotlib==3.0.1
mkl-fft==1.0.6
mkl-random==1.0.1
munch==2.3.2
nbformat==4.4.0
numpy==1.15.4
pandas==0.23.4
plotly==3.4.1
psycopg2==2.7.5
pycparser==2.19
pyOpenSSL==18.0.0
pyparsing==2.3.0
pyproj==1.9.5.1
pyrsistent==0.14.7
PySAL==1.14.4.post1
PySocks==1.6.8
python-dateutil==2.7.5
pytz==2018.7
requests==2.20.1
retrying==1.3.3
Rtree==0.8.3
scipy==1.1.0
Shapely==1.6.4.post1
six==1.11.0
SQLAlchemy==1.2.14
tornado==5.1.1
traitlets==4.3.2
urllib3==1.23
Werkzeug==0.14.1
I've tried all sorts of combinations of buildpacks (Heroku's Python buildpack and the GDAL buildpack recommended in Heroku's docs). The app works when running heroku local on my Mac while using only the recommended GDAL buildpack. I'm new to the deployment process, so I'm unclear where I might be going astray.
Will add that my config vars are currently these:
BUILD_WITH_GEO_LIBRARIES=1
GDAL_LIBRARY_PATH=os.environ.get('GDAL_LIBRARY_PATH')
GEOS_LIBRARY_PATH=os.environ.get('GEOS_LIBRARY_PATH')
WEB_CONCURRENCY=3
I'm using Apache Storm 0.10.0-beta1 and have started converting some topologies to Flux. I decided to start with a simple topology that reads from one Kafka queue and writes to a different Kafka queue. I get the error below and am having a hard time figuring out what is wrong. The topology yaml file follows the error.
Parsing file: /Users/frank/src/mapper/mapper.yaml
388 [main] INFO o.a.s.f.p.FluxParser - loading YAML from input stream...
391 [main] INFO o.a.s.f.p.FluxParser - Not performing property substitution.
391 [main] INFO o.a.s.f.p.FluxParser - Not performing environment variable substitution.
466 [main] INFO o.a.s.f.FluxBuilder - Detected DSL topology...
Exception in thread "main" java.lang.NullPointerException
at org.apache.storm.flux.FluxBuilder.canInvokeWithArgs(FluxBuilder.java:561)
at org.apache.storm.flux.FluxBuilder.findCompatibleConstructor(FluxBuilder.java:392)
at org.apache.storm.flux.FluxBuilder.buildObject(FluxBuilder.java:288)
at org.apache.storm.flux.FluxBuilder.buildSpout(FluxBuilder.java:361)
at org.apache.storm.flux.FluxBuilder.buildSpouts(FluxBuilder.java:349)
at org.apache.storm.flux.FluxBuilder.buildTopology(FluxBuilder.java:84)
at org.apache.storm.flux.Flux.runCli(Flux.java:153)
at org.apache.storm.flux.Flux.main(Flux.java:98)
Topology yaml:
name: "mapper-topology"
config:
  topology.workers: 1
  topology.debug: true
  kafka.broker.properties.metadata.broker.list: "localhost:9092"
  kafka.broker.properties.request.required.acks: "1"
  kafka.broker.properties.serializer.class: "kafka.serializer.StringEncoder"
# component definitions
components:
  - id: "topicSelector"
    className: "storm.kafka.bolt.selector.DefaultTopicSelector"
    constructorArgs:
      - "schemaq"
  - id: "kafkaMapper"
    className: "storm.kafka.bolt.mapper.FieldNameBasedTupleToKafkaMapper"
# spout definitions
spouts:
  - id: "kafka-spout"
    className: "storm.kafka.SpoutConfig"
    parallelism: 1
    constructorArgs:
      - ref: "zkHosts"
      - "mapperq"
      - "/mapperq"
      - "id-mapperq"
    properties:
      - name: "forceFromStart"
        value: true
      - name: "scheme"
        ref: "stringMultiScheme"
# bolt definitions
bolts:
  - id: "kafka-bolt"
    className: "storm.kafka.bolt.KafkaBolt"
    parallelism: 1
    configMethods:
      - name: "withTopicSelector"
        args: [ref: "topicSelector"]
      - name: "withTupleToKafkaMapper"
        args: [ref: "kafkaMapper"]
# streams
streams:
  - name: "kafka-spout --> kafka-bolt"
    from: "kafka-spout"
    to: "kafka-bolt"
    grouping:
      type: SHUFFLE
And here is the command:
storm jar /Users/frank/src/mapper/target/mapper-0.1.0-SNAPSHOT-standalone.jar org.apache.storm.flux.Flux --local mapper.yaml
The spout className should be storm.kafka.KafkaSpout, not storm.kafka.SpoutConfig. You should define the SpoutConfig in the "components" section and have the spout reference it.
You can refer to https://github.com/apache/storm/blob/master/external/flux/flux-examples/src/main/resources/kafka_spout.yaml to see how to set up KafkaSpout from Flux.
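Roughly, the spout part of your file would then look something like this (a sketch only, keeping your topic/zkRoot/id values; the zkHosts broker address is a placeholder, and the "scheme" property is omitted here but would move along with the rest of the SpoutConfig definition):
components:
  - id: "zkHosts"
    className: "storm.kafka.ZkHosts"
    constructorArgs:
      - "localhost:2181"
  - id: "spoutConfig"
    className: "storm.kafka.SpoutConfig"
    constructorArgs:
      - ref: "zkHosts"
      - "mapperq"
      - "/mapperq"
      - "id-mapperq"
    properties:
      - name: "forceFromStart"
        value: true
  # ... keep topicSelector and kafkaMapper here as before ...
spouts:
  - id: "kafka-spout"
    className: "storm.kafka.KafkaSpout"
    parallelism: 1
    constructorArgs:
      - ref: "spoutConfig"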