Stuck building gatsby-starter-netlify-cms - graphql

I recently switched from React Static to Gatsby and hit a wall. When I install and build gatsby-starter-netlify-cms, I get this error:
success open and validate gatsby-configs — 0.049 s
success load plugins — 1.757 s
success onPreInit — 53.736 s
success delete html and css files from previous builds — 0.013 s
success initialize cache — 0.725 s
success copy gatsby files — 4.323 s
success onPreBootstrap — 0.261 s
success source and transform nodes — 1.672 s
success building schema — 6.302 s
success createPages — 0.634 s
success createPagesStatefully — 0.525 s
success onPreExtractQueries — 0.211 s
success update schema — 1.565 s
error GraphQL Error Field "image" must not have a selection since type "String" has no subfields.
file: C:/Users/Jason/Dropbox/Documents/Projects/jamamuuga-s-portfolio-gatsby-netlifycms/src/templates/product-page.js
1 |
2 | query ProductPage($id: String!) {
3 | markdownRemark(id: { eq: $id }) {
4 | frontmatter {
5 | title
> 6 | image {
| ^
7 | childImageSharp {
8 | fluid(maxWidth: 2048, quality: 100) {
9 | ...GatsbyImageSharpFluid
10 | }
11 | }
12 | }
13 | heading
14 | description
15 | intro {
16 | blurbs {
error Command failed with exit code 1.
I tried with both yarn and npm separately, to no avail.

Check your markdown files for files that have an empty value for the image field, or a value that points to a non-existent image. The frontmatter of a markdown file is the block of metadata at the top of the file, surrounded by ---. For example, a broken file might look like this:
---
image:
# the other variables follow, like heading, title, ...
---
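For comparison, a frontmatter block that satisfies the query has image populated with a path to an image that really exists in the repository; the field names and path below are only placeholders to illustrate the shape:
---
templateKey: product-page
title: My product
image: /img/my-product.jpg
heading: My heading
description: My description
---
Once every markdown file used by product-page.js has a non-empty image pointing at a real file, Gatsby infers the field as a File node instead of a String, and the childImageSharp selection in the query becomes valid.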

Related

How to get execution times of Azure Function steps?

I have an Azure Function set up as illustrated below. I need to understand the execution times for the Trigger, Function, and Output steps because, even after the function is "warm", the first request takes up to 7 seconds. After that, the execution time drops to something like 100 ms.
So far I have switched the logging level in host.json to:
"logging": {
"fileLoggingMode": "always",
"logLevel": {
"default": "Information",
"Host.Results": "Error",
"Function": "Trace",
"Host.Aggregator": "Trace"
}
}
and watched the simple telemetry in live logs:
8:48:07 AM | Trace Request successfully matched the route with name 'main' and template 'api/{*segments}'
8:48:06 AM | Trace Executing 'Functions.main' (Reason='This function was programmatically called via the host APIs.', Id=...)
That's pretty much all I see. Also, when opening the Application - Functions - Function - main log from Visual Studio, the log levels still have the [Information] preamble.
What I would like to see is basically a duration output like the one in the Monitor section (Functions - Main in the web portal), but split by step. For example:
date | step     | success | result code | duration (ms)
--------------------------------------------------------
.... | trigger  | success | 200         | 39
.... | function | success | 200         | 32
.... | output   | success | 200         | 37
How to get the duration time for Trigger, Function, and Output steps on every execution?
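The Monitor view in the portal is backed by Application Insights, so one way to get a per-invocation duration (though not a per-step breakdown) is to query its requests table directly. A minimal Kusto sketch, assuming the function app is connected to an Application Insights resource and the function is named main:

requests
| where name == "main"
| project timestamp, name, success, resultCode, duration
| order by timestamp desc
| take 50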

hh_client reports errors on package

I am following the instructions in Getting Started on the official Hacklang website.
As it says, I run:
$ touch .hhconfig
$ mkdir bin src tests
$ cat > hh_autoload.json
{
  "roots": [
    "src/"
  ],
  "devRoots": [
    "tests/"
  ],
  "devFailureHandler": "Facebook\\AutoloadMap\\HHClientFallbackHandler"
}
$ composer require hhvm/hsl hhvm/hhvm-autoload
Then I run hh_client, which throws 74 errors like these:
Typing[4110] You cannot use HH_FIXME or HH_IGNORE_ERROR comments to suppress error 4110
--> vendor/autoload.hack
318 | \HH\autoload_set_paths(/* HH_FIXME[4110] incorrect hhi */ $map, Generated\root());
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Typing[4110] Invalid argument
--> vendor/autoload.hack
318 | \HH\autoload_set_paths(/* HH_FIXME[4110] incorrect hhi */ $map, Generated\root());
| ^^^^
--> /private/tmp/hh_server/hhi_3f14b466/functions.hhi
82 | KeyedContainer<string, KeyedContainer<string, string>> $map,
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Expected KeyedContainer<string, string>
--> vendor/hhvm/hhvm-autoload/src/FailureHandler.hack
46 | final public function handleFailure(string $kind, string $name): void {
| ^^^^^^^^^^^^^ But got (function(string $kind, string $name): void)
Naming[2050] You cannot use HH_FIXME or HH_IGNORE_ERROR comments to suppress error 2050
--> vendor/bin/hh-autoload.hack
179 | GenerateScript::main(vec(/* HH_IGNORE_ERROR[2050] */ $GLOBALS['argv']));
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
...
Those errors are related to the hhvm-autoload and hsl packages installed through Composer.
Why is that?
The Hack typechecker recently (as of HHVM 4.62) changed from 'allow by default' to 'ban by default' for error codes in HH_FIXME comments.
Try adding this to your .hhconfig:
allowed_fixme_codes_strict = 2050, 4110
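Since the tutorial starts from an empty .hhconfig (created with touch .hhconfig), the resulting file can simply be that single line; restart the typechecker afterwards (for example with hh_client restart) so it picks up the change:

allowed_fixme_codes_strict = 2050, 4110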

Snowflake Not Accepting File Format In Bulk Load

I am creating some new ETL tasks for our data pipeline. We currently have several hundred tasks loading data from various S3 buckets.
So it would go like this:
create or replace stage ETL_STAGE url='s3://bucketname/'
file_format = csv_etl;

create or replace file format csv_etl
type = 'CSV'
field_delimiter = ','
skip_header = 1
FIELD_OPTIONALLY_ENCLOSED_BY = '"';

copy into db.schema.table
from @ETL_STAGE/Usage
pattern = '/.*[.]csv'
on_error = 'continue';
However, whenever I use this, the file format is not only failing to strip the enclosing double quotes, it is not even skipping the header, so I get this:
+-------------------+----------+----------------+---------------------+-------------------+
| "Usage Task Name" | "Value"  | "etl_uuid"     | "etl_deviceServer"  | "etl_timestamp"   |
| "taskname"        | "0"      | "adfasdfasdf"  | "hostserverip"      | "2020-04-06 2124" |
+-------------------+----------+----------------+---------------------+-------------------+
Pretty perplexed by this, as I am 99% certain the formatting options are correct here.
Run the command below, including the file_format option. This applies the file format while loading the file:
copy into db.schema.table
from @ETL_STAGE/Usage
pattern='/.*[.]csv'
on_error = 'continue'
file_format = csv_etl;
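To sanity-check the format against the staged files, the stage can also be queried directly with the same named file format; a small sketch, assuming the Usage prefix holds the CSVs from the question:

select $1, $2, $3
from @ETL_STAGE/Usage
(file_format => 'csv_etl')
limit 10;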

Angular2 & typescript get error with import

I'm starting to implement a simple component based on Angular 2, but I'm running into an issue with tsconfig.json and imports.
Here is my structure
Root
|
node_modules
| |
| @angular
| |
| Core
| platform-browser-dynamic
Script
|
Components
|
MyFirstComponent.ts
MyFirstComponentService.ts
Here is my code
import { bootstrap } from '@angular/platform-browser-dynamic'; // this line is ok
import { Component } from '@angular/core'; // this line is ok
import { FirstService } from 'Root/Script/Components/MyFirstComponentService'; // this line gets an error
@Component({
selector: 'firstcomponent',
template: '<div>My First Component</div>',
})
export class MyFirstComponent {
constructor(public abc : FirstService)
{
console.log(abc.doSomething());
}
}
bootstrap(MyFirstComponent, [FirstService]);
But I get an error at:
import { ABCService } from 'Root/Script/Components/MyFirstComponent';
For some reason I don't want to use import { ABCService } from './MyFirstComponent';
What config should I use in tsconfig.json to make all three imports work? I've tried rootDir but it did not help.
I'm using VS2015, TypeScript 1.8.32.
Thank you very much!
You do not need to specify the full path; the service and the component are in the same folder, so you can use ./ like so:
import { FirstService } from './MyFirstComponentService';
EDIT: Going by your comment, I THINK you're asking this. Say you have another subfolder inside your Root, and another subfolder inside your Components, so you have this now:
Root
|
node_modules
| |
| @angular
| |
| Core
| platform-browser-dynamic
Script
|
Components
| |
| MyFirstComponent.ts
| MyFirstComponentService.ts
| |
| navbar
| |
| navbar.component.ts
| navbar.component.html
|
Shared
|
authservice.component.ts
If you wanted to access navbar.component from inside that same Components folder, you would use:
import { FirstService } from './navbar/navbar.component';
You would need to specify that, from the current folder (./), you go into the navbar folder and get the component there.
Now if you wanted to access the authservice.component, you would do the following:
import { FirstService } from '../Shared/authservice.component';
The reason for this is that the Shared folder is located one level higher than the current folder you're in; that's why you would use ../, which essentially takes you one folder "higher".
Does that explain it better? I just added random "common" components. Also, maybe consider changing your folder structure and naming your folders/components in lower case only.
Use ./MyFirstComponent in place of that full path; it works fine with that.
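If non-relative imports like the one in the question are really wanted, the usual mechanism is TypeScript's baseUrl/paths mapping. A minimal tsconfig.json sketch, assuming it sits in the Root folder; note that paths support arrived in TypeScript 2.0, so it will not work on 1.8, and the runtime module loader (SystemJS, webpack, etc.) needs an equivalent alias because paths only affects compile-time resolution:

{
  "compilerOptions": {
    "target": "es5",
    "module": "commonjs",
    "moduleResolution": "node",
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "baseUrl": ".",
    "paths": {
      "Root/*": ["*"]
    }
  }
}

With that mapping, import { FirstService } from 'Root/Script/Components/MyFirstComponentService' resolves to ./Script/Components/MyFirstComponentService relative to the baseUrl.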

Use SMO to clone an Azure SQL database?

I'm writing a program to test update scripts for Azure SQL.
The idea is to
- first clone a database (or fill a clone with the source schema and content)
- then run the update script on the clone
Locally I have this working, but for Azure I have the problem that I don't see any file names. If I restore one database to another on the same Azure "server", don't I have to rename the data files during the restore too?
For local restore I do this:
// point the Restore object at the backup file
restore.Devices.AddDevice(settings.BackupFileName, DeviceType.File);
// map the logical data and log files to new physical paths for the test copy
restore.RelocateFiles.Add(new RelocateFile("<db>", Path.Combine(settings.DataFileDirectory, settings.TestDatabaseName + ".mdf")));
restore.RelocateFiles.Add(new RelocateFile("<db>_log", Path.Combine(settings.DataFileDirectory, settings.TestDatabaseName + "_1.ldf")));
// run the restore against the server connection
restore.SqlRestore(srv);
Is something similar required for cloning a database on Azure?
Lots of Greetings!
Volker
You can create a database as a copy of [source]:
CREATE DATABASE database_name [ COLLATE collation_name ]
  | AS COPY OF [source_server_name.]source_database_name
{
  (<edition_options> [, ...n])
}

<edition_options> ::=
{
    MAXSIZE = { 100 MB | 500 MB | 1 | 5 | 10 | 20 | 30 … 150…500 } GB
  | EDITION = { 'web' | 'business' | 'basic' | 'standard' | 'premium' }
  | SERVICE_OBJECTIVE =
      { 'basic' | 'S0' | 'S1' | 'S2' | 'S3'
      | 'P1' | 'P2' | 'P3' | 'P4' | 'P6' | 'P11'
      | { ELASTIC_POOL(name = <elastic_pool_name>) } }
}
[;]
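For example, a hypothetical copy on the same logical server (MyDb and MyDb_Test are placeholder names; run this against the master database):

CREATE DATABASE MyDb_Test AS COPY OF MyDb;

-- the copy runs asynchronously; progress of in-flight copies can be checked from master
SELECT * FROM sys.dm_database_copies;

Since Azure SQL Database never exposes physical file names, there is nothing like the RelocateFiles step from the local restore to worry about.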
