How to index GeoJSON data into Elasticsearch using curl
I'd like to index GeoJSON data into Elasticsearch using curl.
The GeoJSON data looks like this:
{
"type": "FeatureCollection",
"name": "telco_new_development",
"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
"features": [
{ "type": "Feature", "properties": { "ogc_fid": 1, "name": "Yarrabilba", "carrier_name": "OptiComm", "uid": "35", "development_name": "Yarrabilba", "stage": "None", "developer_name": "Refer to Carrier", "development_nature": "Residential", "development_type": "Sub-division", "estimated_number_of_lots_or_units": "18500", "status": "Ready for service", "developer_application_date": "Check with carrier", "contract_date": "TBC", "estimated_service_date": "30 Jul 2013", "technology_type": "FTTP", "last_modified_date": "8 Jul 2020" }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 153.101112, -27.797998 ], [ 153.09786, -27.807122 ], [ 153.097715, -27.816313 ], [ 153.100598, -27.821068 ], [ 153.103789, -27.825047 ], [ 153.106079, -27.830225 ], [ 153.108248, -27.836107 ], [ 153.110692, -27.837864 ], [ 153.116288, -27.840656 ], [ 153.119923, -27.844818 ], [ 153.122317, -27.853523 ], [ 153.127785, -27.851777 ], [ 153.131234, -27.85115 ], [ 153.135634, -27.849741 ], [ 153.138236, -27.848668 ], [ 153.141703, -27.847075 ], [ 153.152205, -27.84496 ], [ 153.155489, -27.843381 ], [ 153.158613, -27.841546 ], [ 153.161937, -27.84059 ], [ 153.156361, -27.838492 ], [ 153.157097, -27.83451 ], [ 153.15036, -27.832705 ], [ 153.151126, -27.827536 ], [ 153.15169, -27.822564 ], [ 153.148492, -27.820801 ], [ 153.148375, -27.817969 ], [ 153.139019, -27.815804 ], [ 153.139814, -27.808556 ], [ 153.126486, -27.80576 ], [ 153.124679, -27.803584 ], [ 153.120764, -27.802953 ], [ 153.121397, -27.797353 ], [ 153.100469, -27.79362 ], [ 153.099828, -27.793327 ], [ 153.101112, -27.797998 ] ] ] ] } },
{ "type": "Feature", "properties": { "ogc_fid": 2, "name": "Elliot Springs", "carrier_name": "OptiComm", "uid": "63", "development_name": "Elliot Springs", "stage": "None", "developer_name": "Refer to Carrier", "development_nature": "Residential", "development_type": "Sub-division", "estimated_number_of_lots_or_units": "11674", "status": "Ready for service", "developer_application_date": "Check with carrier", "contract_date": "TBC", "estimated_service_date": "29 Nov 2018", "technology_type": "FTTP", "last_modified_date": "8 Jul 2020" }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 146.862725, -19.401424 ], [ 146.865987, -19.370253 ], [ 146.872767, -19.370901 ], [ 146.874484, -19.354706 ], [ 146.874913, -19.354301 ], [ 146.877059, -19.356811 ], [ 146.87972, -19.35835 ], [ 146.889161, -19.359321 ], [ 146.900062, -19.367581 ], [ 146.884955, -19.38507 ], [ 146.88341, -19.402558 ], [ 146.862725, -19.401424 ] ] ] ] } },
...
However, my curl command returns the error The bulk request must be terminated by a newline [\n]:
curl -H 'Content-Type: application/x-ndjson' -XPOST 'localhost:9200/geo/building/_bulk?pretty' --data-binary @building.geojson
{
"error" : {
"root_cause" : [
{
"type" : "illegal_argument_exception",
"reason" : "The bulk request must be terminated by a newline [\\n]"
}
],
"type" : "illegal_argument_exception",
"reason" : "The bulk request must be terminated by a newline [\\n]"
},
"status" : 400
}
Any suggestions?
Your format is not suitable for _bulk like that, as it's missing the structure the API expects. https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html goes into that.
You need:
to update your JSON file to have an action line like { "index" : { "_index" : "INDEX-NAME-HERE" } } before each of the documents
each document to be on a single line
each line to end with a \n so that the bulk API knows where each action/record ends
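Those three steps can be sketched in Python. This is a minimal illustration, assuming the source file is named building.geojson and the target index is geo (both taken from the question; adjust as needed):

```python
import json

def to_bulk_body(collection, index):
    """Rewrite a GeoJSON FeatureCollection as an Elasticsearch bulk body:
    one action line plus one single-line document per feature."""
    lines = []
    for feature in collection["features"]:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(feature))
    # The bulk API requires the body to end with a newline.
    return "\n".join(lines) + "\n"
```

After writing the returned string to a file (say building.ndjson), the same curl command works with --data-binary @building.ndjson; the trailing newline satisfies the terminator requirement from the error message.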
Related
Set up a Kibana dashboard for data about the last Jenkins build
I use Kibana to show data about automated test cases stored in a single Elasticsearch index. These tests can be repeated multiple times during the day and right now are identified by a build number that comes from Jenkins. So, if I want to see the latest results, I need to add a filter to my dashboards where I set the last known value of the build number. Is there a way to automatically show the values for the last build in a dashboard? Thank you.
EDIT: Here's a data sample:
{ "_index": "data", "_type": "_doc", "_id": "33rugH0B0CwJH7IcV11v", "_score": 1, "_source": { "market": "FRA", "price_code": "DIS22FREH1003", "test_case_id": "NPM_14", "environment": "PROD", "cruise_id": "DI20220707CPVCP1", "jenkins_job_name": "MonitoringNPM_14", "#timestamp": "2021-12-03T16:34:03.360+0100", "jenkins_job_number": 8, "agency": "FR900000", "fail_code": "IncorrectGuarantee", "build_number": 8, "category": "IR2" }, "fields": { "environment.keyword": [ "PROD" ], "test_case_id": [ "NPM_14" ], "category.keyword": [ "IR2" ], "price_code": [ "DIS22FREH1003" ], "cruise_id": [ "DI20220707CPVCP1" ], "price_code.keyword": [ "DIS22FREH1003" ], "agency": [ "FR900000" ], "jenkins_job_number": [ "8" ], "agency.keyword": [ "FR900000" ], "jenkins_job_number.keyword": [ "8" ], "market": [ "FRA" ], "jenkins_job_name.keyword": [ "MonitoringNPM_14" ], "test_case_id.keyword": [ "NPM_14" ], "environment": [ "PROD" ], "#timestamp": [ "2021-12-03T15:34:03.360Z" ], "jenkins_job_name": [ "MonitoringNPM_14" ], "fail_code.keyword": [ "IncorrectGuarantee" ], "fail_code": [ "IncorrectGuarantee" ], "build_number": [ 8 ], "market.keyword": [ "FRA" ], "cruise_id.keyword": [ "DI20220707CPVCP1" ], "category": [ "IR2" ] } }
What is the JSON mapping to insert geo data into Elasticsearch?
What would be the JSON mapping to insert geo data into Elasticsearch, if the sample JSON data is as follows:
{ "type": "Feature", "properties": { "ID": 631861455.000000, "address": "1206 UPPER", "city": "la vegas", "state": "AL", "zip_code": "15656", "OGR_GEOMETRY": "POLYGON" }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -86.477551331, 32.490605650000099 ], [ -86.477637350999899, 32.4903921820001 ], [ -86.478257247, 32.490565591000099 ], [ -86.478250466, 32.490580239000103 ], [ -86.478243988, 32.490593680000096 ], [ -86.47823751, 32.490607122 ], [ -86.478231749, 32.490619100000096 ], [ -86.478224637, 32.490634065000101 ], [ -86.47821823699999, 32.490647540000097 ], [ -86.478211847999901, 32.490661035000095 ], [ -86.478205478999897, 32.490674526000099 ], [ -86.478202107999891, 32.490681666000093 ], [ -86.478199132, 32.4906880240001 ], [ -86.478192825999898, 32.490701523 ], [ -86.478186533, 32.490715047 ], [ -86.47818320899999, 32.490722209000097 ], [ -86.47818027999989, 32.490728569000098 ], [ -86.478174063, 32.490742125000097 ], [ -86.47816785099999, 32.490755654000097 ], [ -86.47816255799999, 32.490767236000096 ], [ -86.478159053999889, 32.490774513000105 ], [ -86.477551331, 32.490605650000099 ] ] ] } }
Look at the geo-point mapping documentation. You need to define a mapping explicitly.
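Since the sample geometry is a Polygon rather than a single point, the geo_shape type is the usual fit (geo_point covers single lat/lon pairs). A minimal sketch of such a mapping, assuming a hypothetical index name buildings; the field names mirror the sample document:

```json
PUT buildings
{
  "mappings": {
    "properties": {
      "geometry": { "type": "geo_shape" },
      "properties": {
        "properties": {
          "ID":           { "type": "double" },
          "address":      { "type": "text" },
          "city":         { "type": "keyword" },
          "state":        { "type": "keyword" },
          "zip_code":     { "type": "keyword" },
          "OGR_GEOMETRY": { "type": "keyword" }
        }
      }
    }
  }
}
```

With this mapping in place, the Feature's geometry object can be indexed as-is, since geo_shape accepts GeoJSON geometry directly.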
Error in importing geojson polygon into RethinkDB
I have the following GeoJSON polygon that I'd like to import into RethinkDB. I attempted to use the following r.geojson approach (refer to Building an earthquake map with RethinkDB and GeoJSON):

r.db("strongloop").table("region").insert(
  r.http("91231cd2.ngrok.io/data/geojson/MP14_REGION_WEB_PL_FLAT.json")("features")
    .merge(function(zone) { return { zone: r.geojson(zone("geometry")) } }))

This approach gives me the following error:

RqlRuntimeError: Invalid LinearRing. Are there antipodal or duplicate vertices? Is it self-intersecting? in: r.db("strongloop").table("region").insert(r.http("91231cd2.ngrok.io/data/geojson/MP14_REGION_WEB_PL_FLAT.json")("features").merge(function(var_63) { return {zone: r.geojson(var_63("geometry"))}; }))

I suspect it is because the GeoJSON comes from a flattened multipolygon (done using the geojson.io Meta feature, since RethinkDB does not support multipolygons), but visually the polygon is what I expected.

I've also attempted the r.polygon approach as follows:

r.db("strongloop").table("region").insert(
  r.http("91231cd2.ngrok.io/data/geojson/MP14_REGION_WEB_PL_FLAT.json")("features")
    .merge(function(zone) { return { zone: r.polygon(zone("geometry")("coordinates")) } }))

but RethinkDB expects me to pass an array of points:

RqlCompileError: Expected 3 or more arguments but found 1 in: r.db("strongloop").table("region").insert(r.http("91231cd2.ngrok.io/data/geojson/MP14_REGION_WEB_PL_FLAT.json")("features").merge(function(var_64) { return {zone: r.polygon(var_64("geometry")("coordinates"))}; }))

I could not figure out how to extract the array of geometry coordinates before passing it to r.polygon in the code above. How should I resolve this? Is there a better way?
Check URL
What happens if you do the HTTP request by itself? I tried it and got a 404 on that resource. I would run that command first and make sure it works:

r.http("91231cd2.ngrok.io/data/geojson/MP14_REGION_WEB_PL_FLAT.json")

Try r.args
r.polygon expects 3 or more arguments, but you're passing it one. You might try using r.args to spread those arguments in the function (similar to Function.apply, if you come from JavaScript):

r.db("strongloop").table("region").insert(
  r.http("91231cd2.ngrok.io/data/geojson/MP14_REGION_WEB_PL_FLAT.json")("features")
    .merge(function(zone) {
      return { zone: r.polygon(r.args(zone("geometry")("coordinates"))) }
    }))

Invalid Polygons
It seems that one of your polygons might be invalid. If you take the input from the other answer you posted and use it in the following function, it will work:

r.expr(ARRAY_FROM_OTHER_ANSWER_WITH_POLYGONS).do(function (arr) {
  // Remove middle element
  return arr.slice(0, 2).add(arr.slice(3, 5))
    // Map all elements to polygons
    // Make sure you pass an array of LON/LATs into `r.polygon`
    .map(function (row) {
      return r.polygon(r.args(row('geometry')('coordinates')(0)));
    });
})

Perhaps you can insert them one by one and catch the ones that fail. The following query fails:

r.json(r.http('https://gist.githubusercontent.com/tekoadjaib/9e0f0729c050b69b283f/raw/6950765ed4a931b9b208e69a3c39b3114be5c7e3/map.geojson'))('features')
  .map(function (row) { return r.polygon(r.args(row('geometry')('coordinates')(0))) })

While this one (slicing the first 17 elements) doesn't:

r.json(r.http('https://gist.githubusercontent.com/tekoadjaib/9e0f0729c050b69b283f/raw/6950765ed4a931b9b208e69a3c39b3114be5c7e3/map.geojson'))('features')
  .slice(0, 17)
  .map(function (row) { return r.polygon(r.args(row('geometry')('coordinates')(0))) })
Running this command on RethinkDB Data Explorer r.http("6f892736.ngrok.io/data/geojson/MP14_REGION_WEB_PL_FLAT.json") ("features") I received (removed some to fit SO limit) [ { "geometry": { "coordinates": [ [ [ 103.84874965353661, 1.363027350968694 ], [ 103.84924291873233, 1.362752820720889 ], [ 103.84935645049902, 1.362682483934137 ], [ 103.84973091592526, 1.362406260504349 ], [ 103.84992386166968, 1.362263929416379 ], [ 103.85137091514449, 1.361102941278631 ], [ 103.85194295836556, 1.360652909851773 ], [ 103.8522672818684, 1.360326981650605 ], [ 103.82714577966834, 1.241938910658531 ], [ 103.82714429349235, 1.241945678019855 ] ] ], "type": "Polygon" }, "properties": { "FMEL_UPD_D": "2014/12/05", "INC_CRC": "F6D4903B6C0B72F8", "OBJECTID": 1, "REGION_C": "CR", "REGION_N": "CENTRAL REGION", "SHAPE_Area": 136405631.404, "SHAPE_Leng": 131065.464453, "X_ADDR": 27836.5573, "Y_ADDR": 31929.9186 }, "type": "Feature" }, { "geometry": { "coordinates": [ [ [ 103.82233409716747, 1.247288081083765 ], [ 103.82235533213682, 1.247267117084279 ], [ 103.82236522405806, 1.247257351790808 ], [ 103.82239090120945, 1.24726873242952 ], [ 103.82242101348092, 1.247282079176339 ], [ 103.82244356466394, 1.247292075218428 ], [ 103.8224842992999, 1.247310126516275 ], [ 103.82229812450149, 1.247323592638841 ], [ 103.82233409716747, 1.247288081083765 ] ] ], "type": "Polygon" }, "properties": { "FMEL_UPD_D": "2014/12/05", "INC_CRC": "F6D4903B6C0B72F8", "OBJECTID": 1, "REGION_C": "CR", "REGION_N": "CENTRAL REGION", "SHAPE_Area": 136405631.404, "SHAPE_Leng": 131065.464453, "X_ADDR": 27836.5573, "Y_ADDR": 31929.9186 }, "type": "Feature" }, { "geometry": { "coordinates": [ [ [ 103.81270485670082, 1.253738883481215 ], [ 103.81270819925057, 1.253736038375202 ], [ 103.81271747666578, 1.25372269002029 ], [ 103.81272796270179, 1.253696608214719 ], [ 103.81272550089241, 1.25367617858827 ], [ 103.81271663253291, 1.253660777180855 ], [ 103.81268489667742, 1.253637281524639 ], [ 103.81271738240808, 
1.253596580832204 ], [ 103.81272993138482, 1.25358085854482 ], [ 103.81277884770714, 1.253527980705921 ], [ 103.81280844732579, 1.253491732178708 ], [ 103.81284783649771, 1.253444824881476 ], [ 103.81289710415396, 1.253390129269173 ], [ 103.81268859863616, 1.253749463519217 ], [ 103.81269418659909, 1.253747966839359 ], [ 103.81269933160983, 1.253743587040128 ], [ 103.81270485670082, 1.253738883481215 ] ] ], "type": "Polygon" }, "properties": { "FMEL_UPD_D": "2014/12/05", "INC_CRC": "F6D4903B6C0B72F8", "OBJECTID": 1, "REGION_C": "CR", "REGION_N": "CENTRAL REGION", "SHAPE_Area": 136405631.404, "SHAPE_Leng": 131065.464453, "X_ADDR": 27836.5573, "Y_ADDR": 31929.9186 }, "type": "Feature" }, { "geometry": { "coordinates": [ [ [ 103.81422064298785, 1.252570070111331 ], [ 103.81420878069888, 1.252558437176131 ], [ 103.8142083125649, 1.252558523991746 ], [ 103.81420773751474, 1.252557413427196 ], [ 103.81419024508641, 1.252540259301697 ], [ 103.81418525918728, 1.252535369363019 ], [ 103.81417827319876, 1.252525659143691 ], [ 103.81417483815903, 1.252520884975136 ], [ 103.81417401960785, 1.25251974727783 ], [ 103.8141688845694, 1.252512609990775 ], [ 103.81416372167686, 1.252505434720186 ], [ 103.81415922015667, 1.252490646509147 ], [ 103.81415911592985, 1.252490302849641 ], [ 103.81415883649417, 1.252489384917013 ], [ 103.81415845013613, 1.252488116994314 ], [ 103.8141580341274, 1.252486750495618 ], [ 103.8141581823891, 1.252486131910979 ], [ 103.81415836569447, 1.252485366819468 ], [ 103.81415859033339, 1.252484429898899 ], [ 103.81415863076748, 1.252484387394026 ], [ 103.81415862268139, 1.252484294244369 ], [ 103.81416084480966, 1.252475025422944 ], [ 103.81416187994584, 1.25247070708728 ], [ 103.81416579397985, 1.252464651487459 ], [ 103.81417344413717, 1.25245281611786 ], [ 103.81417638235824, 1.25244826989617 ], [ 103.81418792868888, 1.252415206393471 ], [ 103.81423209597757, 1.252398253489821 ], [ 103.8142345678362, 1.252397304828964 ], [ 103.81423924468973, 
1.252395510602936 ], [ 103.8142416716218, 1.252394579124685 ], [ 103.81424197622378, 1.252394461559454 ], [ 103.81424641316956, 1.252392758672508 ], [ 103.81427168974326, 1.252383056828959 ], [ 103.81429695645653, 1.252370091131469 ], [ 103.81430622571014, 1.252365335143192 ], [ 103.81430795807904, 1.252364492287584 ], [ 103.81430790237042, 1.252364474199889 ], [ 103.8143179156083, 1.252359335670564 ], [ 103.81433202075975, 1.252352098139711 ], [ 103.81434014977647, 1.252347926362841 ], [ 103.81435231859189, 1.252341681810603 ], [ 103.81435985727103, 1.252338114145249 ], [ 103.81437420681739, 1.25233151509769 ], [ 103.81439499523958, 1.252321955205564 ], [ 103.81439606179633, 1.252321465047431 ], [ 103.81439624419711, 1.252321532876141 ], [ 103.81440192381508, 1.25231876917771 ], [ 103.81441291824507, 1.252313712952395 ], [ 103.81442976570837, 1.252305965378984 ], [ 103.81444317808666, 1.252299726261537 ], [ 103.81445277979405, 1.252295146626335 ], [ 103.81447443170696, 1.252284820742596 ], [ 103.8144961555022, 1.252274461397674 ], [ 103.81450288101018, 1.252271253663335 ], [ 103.81450281451966, 1.252271173174363 ], [ 103.81452463535528, 1.252260878944195 ], [ 103.81458959107665, 1.25223203649656 ], [ 103.81461125376586, 1.252222416920872 ], [ 103.81464389561143, 1.252207399268555 ], [ 103.81465169486319, 1.252203810803814 ], [ 103.81469601222396, 1.252183421305906 ], [ 103.81472344798394, 1.252170002532816 ], [ 103.81478302601083, 1.252140862505733 ], [ 103.81484031637163, 1.252113379257888 ], [ 103.81490026635718, 1.25208846442443 ], [ 103.81491445851239, 1.252102786050432 ], [ 103.81491771565472, 1.252106072536219 ], [ 103.81491908860134, 1.25210674629792 ], [ 103.81493128425387, 1.252117179139818 ], [ 103.8149446623662, 1.2521281301912 ], [ 103.81496569689162, 1.252131490055921 ], [ 103.81497133065815, 1.25213238993891 ], [ 103.81498839193847, 1.252128956184855 ], [ 103.81500279627888, 1.252124305136556 ], [ 103.8150235712083, 1.252116328782531 ], [ 
103.81502467283144, 1.252112382141001 ], [ 103.8150250430265, 1.252112184992011 ], [ 103.81502909456997, 1.252100774646097 ], [ 103.81503503665392, 1.252084043029215 ], [ 103.81503408874394, 1.252078647580404 ], [ 103.81503511578735, 1.25207496772605 ], [ 103.81503518952596, 1.252066416954603 ], [ 103.81502759086356, 1.252038774077729 ], [ 103.81502703827032, 1.252038532608371 ], [ 103.81502617751417, 1.252033633656605 ], [ 103.81501061513713, 1.252014397701631 ], [ 103.81497715415199, 1.251984291152084 ], [ 103.81496625504562, 1.251978257151349 ], [ 103.8149622584071, 1.251976044141786 ], [ 103.8149536020188, 1.251971250945737 ], [ 103.8149359747586, 1.25196149181759 ], [ 103.81491979321531, 1.251953915836878 ], [ 103.8149053934479, 1.251947173693286 ], [ 103.81487481213755, 1.251932856472991 ], [ 103.81486042674678, 1.251926121564181 ], [ 103.8148226958023, 1.251918231615303 ], [ 103.8148209230086, 1.251917860813026 ], [ 103.81482003795891, 1.251917765848395 ], [ 103.8148186775897, 1.251917391431592 ], [ 103.81480645223182, 1.251914834704565 ], [ 103.81478684994318, 1.251914215075008 ], [ 103.81478406270905, 1.25191412733175 ], [ 103.81476586212422, 1.251913551121548 ], [ 103.81475217215036, 1.251913118737449 ], [ 103.81473641375817, 1.251913570807558 ], [ 103.81469760451252, 1.25191528611085 ], [ 103.81464911173373, 1.251930252101506 ], [ 103.81464647993616, 1.251931415096138 ], [ 103.81464604415146, 1.251931198949751 ], [ 103.81463423207276, 1.251934844360703 ], [ 103.81462343982588, 1.251938175059867 ], [ 103.81461634052772, 1.251941862105349 ], [ 103.81459325275814, 1.251953853820537 ], [ 103.81457931114169, 1.251961094067155 ], [ 103.81457055045844, 1.251965643864866 ], [ 103.81450478872569, 1.251999797642535 ], [ 103.81446223335465, 1.252023820888956 ], [ 103.81441832030102, 1.2520486110264 ], [ 103.8143689396795, 1.25207648681698 ], [ 103.81429144309426, 1.252119919289534 ], [ 103.81421839961543, 1.252163028029699 ], [ 103.81418472441868, 1.252183801048487 
], [ 103.81416549402833, 1.252195662559566 ], [ 103.81411929141971, 1.252234061560852 ], [ 103.81407773413764, 1.252281223893357 ], [ 103.81405899059976, 1.252313446282806 ], [ 103.81403917762307, 1.252371587767593 ], [ 103.81403048853925, 1.252414490782956 ], [ 103.81403453146623, 1.252476150424537 ], [ 103.81404507986444, 1.252520744745098 ], [ 103.81406615890175, 1.252581120314285 ], [ 103.81410007800378, 1.252628997652677 ], [ 103.81413071763996, 1.252654609498039 ], [ 103.81415741467501, 1.252660750333722 ], [ 103.81415875168324, 1.252661057827697 ], [ 103.81415903202367, 1.252661122944038 ], [ 103.81416270520218, 1.25266196764807 ], [ 103.81416714302718, 1.252662988708921 ], [ 103.81416796517983, 1.252663120752286 ], [ 103.81420024580622, 1.252650676021723 ], [ 103.8142128693197, 1.252638786426654 ], [ 103.8142159252361, 1.252635907855009 ], [ 103.81421741504606, 1.252629051874232 ], [ 103.81422008102156, 1.252623340828278 ], [ 103.81422398521352, 1.252611535275082 ], [ 103.8142249584107, 1.252598998973394 ], [ 103.81422552184074, 1.252591729690962 ], [ 103.81422064298785, 1.252570070111331 ] ] ], "type": "Polygon" }, "properties": { "FMEL_UPD_D": "2014/12/05", "INC_CRC": "F6D4903B6C0B72F8", "OBJECTID": 1, "REGION_C": "CR", "REGION_N": "CENTRAL REGION", "SHAPE_Area": 136405631.404, "SHAPE_Leng": 131065.464453, "X_ADDR": 27836.5573, "Y_ADDR": 31929.9186 }, "type": "Feature" }, { "geometry": { "coordinates": [ [ [ 103.74130389714394, 1.159976941119677 ], [ 103.7413428293003, 1.159968251449955 ], [ 103.74129985127891, 1.159780169341412 ], [ 103.74126091912512, 1.159788858106384 ], [ 103.74130389714394, 1.159976941119677 ] ] ], "type": "Polygon" }, "properties": { "FMEL_UPD_D": "2014/12/05", "INC_CRC": "11540153B663CA9B", "OBJECTID": 5, "REGION_C": "WR", "REGION_N": "WEST REGION", "SHAPE_Area": 257110296.977, "SHAPE_Leng": 258264.026231, "X_ADDR": 12896.436, "Y_ADDR": 33986.5714 }, "type": "Feature" } ]
Using d3.nest() with geojson files
How is d3.nest() used with GeoJSON files? My GeoJSON data is formatted as follows:

"features": [ { "type": "Feature", "properties": { "neighborhood": "Allerton", "boroughCode": "2", "borough": "Bronx", "#id": "http:\/\/nyc.pediacities.com\/Resource\/Neighborhood\/Allerton" }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -73.848597000000183, 40.871670000000115 ], [ -73.845822536836778, 40.870239076236174 ], [ -73.854559184633743, 40.859953835764252 ], [ -73.854665433068263, 40.859585694988056 ], [ -73.856388703358959, 40.857593635304482 ], [ -73.868881809153407, 40.857223150158326 ], [ -73.868317552728243, 40.857862062258313 ], [ -73.869553714672321, 40.857784095600181 ], [ -73.871024857620654, 40.857309948816905 ], [ -73.870480549987164, 40.865413584098484 ], [ -73.87055489856489, 40.869702798589863 ], [ -73.86721594442561, 40.869689663636713 ], [ -73.85745, 40.869533000000182 ], [ -73.855550000000108, 40.871813000000145 ], [ -73.853597967576576, 40.873288368674203 ], [ -73.848597000000183, 40.871670000000115 ] ] ] } }

But my nest command:

var nested_data = d3.nest()
  .key(function(d, i) {
    console.log(d);
    return d.features.properties.neighborhood;
  })
  .entries(map);

returns an empty array. I want to nest my data to more easily filter it. Is this advised?
Assuming your geojson looks like the below:

var map = {
  type: "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {
        "neighborhood": "Allerton",
        "boroughCode": "2",
        "borough": "Bronx",
        "#id": "http:\/\/nyc.pediacities.com\/Resource\/Neighborhood\/Allerton"
      },
      "geometry": { /* various coordinates, etc */ }
    }
  ]
}

what you want to do is:

d3.nest()
  .key(function(d, i) { return d.properties.neighborhood; })
  .entries(map.features);

You want to pass map.features, since that's your array.
Logstash Grok Filter Apache Access Log
I have been looking around here and there, but could not find a working solution. I am trying to use a grok filter inside the Logstash config file to filter an Apache access log file. The log message looks like this:

{"message":"00.00.0.000 - - [dd/mm/YYYY:hh:mm:ii +0000] \"GET /index.html HTTP/1.1\" 200 00"}

At the moment I can only filter out the client IP by using grok { match => [ "message", "%{IP:client_ip}" ] }. I want to filter:
- the GET method,
- the requested page (index.html),
- HTTP/1.1,
- the server response 200,
- the last number 00 after 200 inside the message body
Please note that neither of these works for me:
grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
or
grok { match => [ "message", "%{COMBINEDAPACHELOG}" ] }
Use the Grok Debugger to get an exact match on your log format. It's the only way: http://grokdebug.herokuapp.com/
grok { match => [ "message", "%{IP:client_ip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:apache_timestamp}\] \"%{WORD:method} /%{NOTSPACE:request_page} HTTP/%{NUMBER:http_version}\" %{NUMBER:server_response} %{NUMBER:bytes}" ] }
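Under the hood, grok patterns compile down to regular expressions. As a rough illustration only (the stock grok patterns such as IP and HTTPDATE are much stricter than these simplified groups), here is a Python sketch of the same idea applied to the sample line from the question:

```python
import re

# Simplified stand-in for the grok pattern above; each named group
# corresponds to one grok capture (client_ip, ident, auth, ...).
APACHE = re.compile(
    r'(?P<client_ip>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\w+) (?P<request_page>\S+) HTTP/(?P<http_version>[\d.]+)" '
    r'(?P<server_response>\d+) (?P<bytes>\d+)'
)

line = '00.00.0.000 - - [dd/mm/YYYY:hh:mm:ii +0000] "GET /index.html HTTP/1.1" 200 00'
fields = APACHE.match(line).groupdict()
# fields["method"] == "GET", fields["server_response"] == "200", fields["bytes"] == "00"
```

This confirms the requested fields (method, page, HTTP version, response, and the trailing number) are all separable from the message body.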
Use the following:

filter { grok { match => { "message" => "%{COMMONAPACHELOG}" } } }

As you can see from the pattern definition, COMBINEDAPACHELOG would fail because your log is missing the referrer and agent components it expects:

COMBINEDAPACHELOG %{COMMONAPACHELOG} %{QS:referrer} %{QS:agent}

https://github.com/elastic/logstash/blob/v1.4.2/patterns/grok-patterns
You can use the COMBINEDAPACHELOG pattern for this:

%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{QS:referrer} %{QS:agent}

For instance, consider this sample Apache log:

111.222.333.123 HOME - [01/Feb/1998:01:08:46 -0800] "GET /bannerad/ad.htm HTTP/1.0" 200 28083 "http://www.referrer.com/bannerad/ba_intro.htm" "Mozilla/4.01 (Macintosh; I; PPC)"

The above filter will produce:

{ "clientip": [ [ "111.222.333.123" ] ], "HOSTNAME": [ [ "111.222.333.123" ] ], "IP": [ [ null ] ], "IPV6": [ [ null ] ], "IPV4": [ [ null ] ], "ident": [ [ "HOME" ] ], "USERNAME": [ [ "HOME", "-" ] ], "auth": [ [ "-" ] ], "timestamp": [ [ "01/Feb/1998:01:08:46 -0800" ] ], "MONTHDAY": [ [ "01" ] ], "MONTH": [ [ "Feb" ] ], "YEAR": [ [ "1998" ] ], "TIME": [ [ "01:08:46" ] ], "HOUR": [ [ "01" ] ], "MINUTE": [ [ "08" ] ], "SECOND": [ [ "46" ] ], "INT": [ [ "-0800" ] ], "verb": [ [ "GET" ] ], "request": [ [ "/bannerad/ad.htm" ] ], "httpversion": [ [ "1.0" ] ], "BASE10NUM": [ [ "1.0", "200", "28083" ] ], "rawrequest": [ [ null ] ], "response": [ [ "200" ] ], "bytes": [ [ "28083" ] ], "referrer": [ [ ""http://www.referrer.com/bannerad/ba_intro.htm"" ] ], "QUOTEDSTRING": [ [ ""http://www.referrer.com/bannerad/ba_intro.htm"", ""Mozilla/4.01 (Macintosh; I; PPC)"" ] ], "agent": [ [ ""Mozilla/4.01 (Macintosh; I; PPC)"" ] ] }

It can be tested here: https://grokdebug.herokuapp.com/