Separate page GraphQL queries into separate files (.graphql) - graphql

Is it possible to separate a page GraphQL query into a separate .graphql file, and then import it in the page file? I assume I need to add some kind of .graphql loader?

If you want to write your queries and mutations in files with a .graphql or .gql extension, you need graphql-tag/loader. Here is the relevant rule from my webpack.config.js:
{
  test: /\.(graphql|gql)$/,
  exclude: /node_modules/,
  use: 'graphql-tag/loader'
}
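With that rule in place, a component can import the operation from its own file; the loader exports a parsed DocumentNode that can be handed to any GraphQL client. A minimal sketch, assuming a hypothetical pageQuery.graphql file next to the component and an Apollo Client instance (note this covers client-side queries; it does not feed Gatsby's build-time page queries):

// pageQuery.graphql is a hypothetical file containing one query document
import PAGE_QUERY from './pageQuery.graphql'
import { client } from './apollo-client' // hypothetical Apollo Client instance

client.query({ query: PAGE_QUERY }).then(({ data }) => {
  console.log(data)
})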

You can use graphql-import to do that.
Example:
Assume the following directory structure:
.
├── schema.graphql
├── posts.graphql
└── comments.graphql
schema.graphql
# import Query.*, Mutation.* from "posts.graphql"
# import Query.*, Mutation.* from "comments.graphql"
posts.graphql
# import Comment from 'comments.graphql'
type Post {
  comments: [Comment]
  id: ID!
  text: String!
  tags: [String]
}
comments.graphql
type Comment {
  id: ID!
  text: String!
}
You can see the full documentation here.
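For reference, graphql-import resolves those # import comments when the schema is loaded. A minimal sketch of loading the combined schema on a server, assuming graphql-import and graphql-tools are installed and the files are laid out as above:

const { importSchema } = require('graphql-import')
const { makeExecutableSchema } = require('graphql-tools')

// Resolves the "# import ..." comments and returns a single SDL string
const typeDefs = importSchema('schema.graphql')

const schema = makeExecutableSchema({ typeDefs, resolvers: {} })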

Check 'apollo-universal-starter-kit'; it should be adaptable to Gatsby.

To my knowledge that's not possible out of the box. The only way to split up queries is with Fragments:
https://www.gatsbyjs.org/docs/querying-with-graphql/#fragments
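For example, a reusable fragment can be exported from any component and spread into a page query by name (a minimal sketch; the fragment and field names are placeholders):

// src/components/site-metadata.js
import { graphql } from 'gatsby'

// Gatsby registers exported fragments globally, so any page query can use them.
export const siteMetadataFragment = graphql`
  fragment SiteInformation on Site {
    siteMetadata {
      title
      description
    }
  }
`

// src/pages/index.js
export const pageQuery = graphql`
  query {
    site {
      ...SiteInformation
    }
  }
`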
However, if you want to get a discussion started, I'd recommend opening an issue on GitHub.

Related

Laravel Lighthouse: How can I get graphql-codegen to generate typings from my schema?

I'm using Laravel Lighthouse as my GraphQL server inside my Laravel app. Now that I've added custom queries and types to my graphql/schema.graphql file, I want to be able to get typings in my TypeScript Vue 3 app, which is located in the Laravel project. However, when I run the graphql-codegen --config codegen.yml command, it fails because the query being generated doesn't match what's in the schema file.
The query in my graphql/schema.graphql file:
type Query {
  posts: [Post!]! @paginate(defaultCount: 10)
}
The query I call in my Vue component:
{
  posts {
    data {
      uuid
      body
      user {
        name
      }
    }
  }
}
The code below is my codegen.yml file
overwrite: true
schema: "./graphql/schema.graphql"
documents: "./resources/js/pages/**/*.vue"
generates:
  ./resources/js/generated.ts:
    plugins:
      - "typescript"
      - "typescript-operations"
      - "typescript-vue-apollo-smart-ops"
Any ideas where I need to point it to, or how to get graphql-codegen to generate the correct typings?
Your source schema is transformed by directives such as @paginate. Use the following artisan command to generate a single file that contains your entire, final schema:
php artisan lighthouse:print-schema
See https://lighthouse-php.com/master/api-reference/commands.html#print-schema
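A minimal sketch of the resulting workflow, assuming you write the printed schema to a file at the project root (the file name is arbitrary) and point graphql-codegen at it instead of the source schema:

php artisan lighthouse:print-schema > schema-compiled.graphql

Then in codegen.yml:

schema: "./schema-compiled.graphql"

Because @paginate rewrites posts into a paginated field with a wrapper type, generating against the printed schema should line up with the posts { data { ... } } shape used in the Vue component.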

How to use Gatsby code snippet with GraphQL

I'd like to use a GraphQL code snippet in an '.mdx' file:
---
title: Releasing A GitHub Action
date: "2021-03-22T12:35:16"
slug: /blog/releasing-a-github-action
description: "After using other people's GitHub Actions, I thought I'd give one a shot."
---
this is text in the .mdx file
/```graphql
mutation UpdateAllEnvironmentVariablesForSite(
  $id: UUID!
  $buildEnvironmentVariables: [TagInput!]!
  $previewEnvironmentVariables: [TagInput!]!
) {
  updateBuildEnvironmentVariablesForSite: updateEnvironmentVariablesForSite(
    id: $id
    environmentVariables: $buildEnvironmentVariables
    runnerType: BUILDS
  ) {
    success
    message
  }
  updatePreviewEnvironmentVariablesForSite: updateEnvironmentVariablesForSite(
    id: $id
    environmentVariables: $previewEnvironmentVariables
    runnerType: PREVIEW
  ) {
    success
    message
  }
}
/```
Continuing to write .mdx
When I develop this, it looks fine. When I build it in Gatsby Cloud, I get the error
Encountered unknown language 'graphql'. If 'graphql' is an alias for a supported language, use the 'languageAliases' plugin option to map it to the canonical language name.
How do I get around this?
As you can see from the MDX+Gatsby docs:
Note: For now, this only works if the .mdx file exporting the query is placed in src/pages. Exporting GraphQL queries from .mdx files that are used for programmatic page creation in gatsby-node.js via actions.createPage is not currently supported.
A sample working query in a .mdx file looks like:
import { graphql } from "gatsby"

# My Awesome Page

Here's a paragraph, followed by a paragraph with data!

<p>{props.data.site.siteMetadata.description}</p>

export const pageQuery = graphql`
  query {
    site {
      siteMetadata {
        description
        title
      }
    }
  }
`
In your case, I guess that your snippet won't work even outside the .mdx file in a Gatsby build, because you are not exporting the query or using a page-query approach. Try adapting your GraphQL query to the "common" way to see how it behaves.

Apollo GraphQL Error: Query root type must be provided

I have an Angular/Apollo GraphQL implementation that generates TypeScript code based on a GraphQL endpoint which surfaces a schema. I can hit the endpoint via Postman with a query and results are returned. However, when I run "graphql-codegen --config codegen.yml" via npm I get this error:
"Error: Query root type must be provided"
The server side is a .NET Core implementation using GraphQL for ASP.NET Core. I have 4 different queries defined and each one works via GraphiQL.
Any ideas on why the query root type is now being returned as null?
A GraphQL schema must have at least one @Query() to be considered valid, so just adding any query to your resolver code may be enough.
Ex:
export class FooResolver {
  @Query(() => String)
  sayHello(): string {
    return 'Hello World!';
  }
}
This error is thrown when your schema stitching/definitions are incorrect. Please check your root schema definitions:
https://www.advancedgraphql.com/content/schema-stitching
I was having the same issue while using graphql-codegen.
My codegen.yml is:
overwrite: true
schema: "http://localhost:3001/graphql"
documents: "src/app/graphql/*.graphql"
generates:
  src/generated/graphql.ts:
    plugins:
      - typescript
      - typescript-operations
      - typescript-apollo-angular
The issue appeared when I used the plugin typescript-apollo-angular. I'm using Node.js with graphql as the backend.
The issue got resolved when I renamed the type RootQuery -> Query and RootMutation -> Mutation in the backend schema.
Before
type RootQuery {
  _empty: String
}

type RootMutation {
  _empty: String
}

schema {
  query: RootQuery
  mutation: RootMutation
}

After

type Query {
  _empty: String
}

type Mutation {
  _empty: String
}

schema {
  query: Query
  mutation: Mutation
}
I ended up reverting to a previous version of the codebase and reapplied my modifications manually, and it works now. The only thing I can think of is that I ran npm update, which updated apollo-angular from 1.8.3 to 1.10.0.
EDIT Here is my code:
codegen.yml (used to generate code from npm command):
overwrite: true
schema: "https://to_end_point/Prod/api/v1/GraphQL"
documents: "src/**/*.graphql"
generates:
  src/generated/graphql.ts:
    plugins:
      - "typescript"
      - "typescript-operations"
      - "typescript-apollo-angular"
  ./graphql.schema.json:
    plugins:
      - "introspection"
After reverting to a previous version of the Angular code and re-applying my code modifications, GraphQL code generation worked again. The only thing I can think of which could have caused this issue was running npm update; the before/after of package.json showed apollo-angular going from 1.8.3 to 1.10.0.

gatsby.js - advanced starter - Implement 2 url prefixes (2 different sections of site)?

It looks as though if you put your jsx files in the 'pages' folder of most gatsby starters, the urls follow the directory structure out of the box, so you can implement whatever urls you need (http://blah.com/foo/post1, http://blah.com/bar/post2) just by nesting folders in the source tree (pages/foo/post.jsx, pages/bar/post2.jsx).
The issue
I used the gatsby advanced starter (https://github.com/Vagr9K/gatsby-advanced-starter). It puts all content files not in pages/, but in a top-level content/ folder, and I can't figure out the wiring to replicate foo/xxx, bar/xxx urls even after creating content/foo/post1.md and content/bar/post2.md.
It does have a siteconfig.js that sets a single path prefix, but I want two different prefixes for the 2 different sections of the site, so I just set it to "/" for now (builds seem to ignore whatever value I set for this config param anyway, so... shrug).
What I tried
I tried adding path to the frontmatter of the .md files, set to the parent folder name. This was completely ignored (in any case I don't think that's what I want... I'd like to keep the generated slug as part of the url).
I also separated the use of gatsby-source-filesystem, one entry for each subfolder, hoping it would change the GraphQL graph to recognize 2 separate data sources, but it made no difference.
What am I doing wrong?
It looks as though if you put your jsx files in the 'pages' folder of most gatsby starters, the urls follow the directory structure out of the box [...]
That's not specific to Gatsby starters, that's Gatsby's default behaviour. Every js/jsx file in src/pages will be a page.
but in a top-level content/ folder
It still has the src/pages folder for normal pages. But the content folder holds the files that will be transformed into pages with the templates in src/templates via gatsby-node.js. In other words: pages for the contents of the content folder get programmatically created with the defined template in gatsby-node.js (and the template lives in src/templates).
The path/URL gets defined in the createPage function, here: gatsby-node.js (line 144). That line references edge.node.fields.slug, which gets queried in the GraphQL query just above it. The field gets added in the onCreateNode function, more specifically via the createNodeField call for the slug field. There you can see that it gets passed a slug that gets defined above.
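For orientation, the shape of that createPages wiring looks roughly like this (a simplified sketch, not the starter's exact code; the query and the template path are placeholders):

exports.createPages = async ({ graphql, actions }) => {
  const { createPage } = actions
  const result = await graphql(`
    {
      allMarkdownRemark {
        edges {
          node {
            fields {
              slug
            }
          }
        }
      }
    }
  `)
  result.data.allMarkdownRemark.edges.forEach(({ node }) => {
    createPage({
      path: node.fields.slug, // this is where the URL is decided
      component: require.resolve('./src/templates/post.jsx'), // hypothetical template path
      context: { slug: node.fields.slug },
    })
  })
}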
Create two folders in your src/content folder, e.g. blog and projects. Make sure that you have both of them defined in your gatsby-config.js:
{
  resolve: 'gatsby-source-filesystem',
  options: {
    name: 'blog',
    path: `${__dirname}/content/blog`,
  },
},
{
  resolve: 'gatsby-source-filesystem',
  options: {
    name: 'projects',
    path: `${__dirname}/content/projects`,
  },
},
In your gatsby-node.js, add the following line after the fileNode definition:
const pathPrefix = fileNode.sourceInstanceName
The sourceInstanceName is what we defined as the name in the gatsby-config entries.
Then you can alter the line to:
createNodeField({ node, name: "slug", value: `/${pathPrefix}${slug}` });
createNodeField({ node, name: 'sourceInstanceName', value: pathPrefix });
The second line is helpful if you then want to query only for one of the two folders, e.g.:
export const pageQuery = graphql`
  query BlogQuery {
    allMarkdownRemark(
      filter: { fields: { sourceInstanceName: { eq: "blog" } } }
    ) {
      edges {
        node {
          ... etc
        }
      }
    }
  }
`
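Pulling the gatsby-node.js pieces above together, a minimal sketch of what the relevant onCreateNode section could look like (the slug derivation via createFilePath is only illustrative; the starter computes its slug differently, so keep its existing logic and only add the prefix and the extra field):

const { createFilePath } = require('gatsby-source-filesystem')

exports.onCreateNode = ({ node, getNode, actions }) => {
  const { createNodeField } = actions
  if (node.internal.type === 'MarkdownRemark') {
    // The parent of a MarkdownRemark node is the File node created by gatsby-source-filesystem
    const fileNode = getNode(node.parent)
    const pathPrefix = fileNode.sourceInstanceName // 'blog' or 'projects'
    const slug = createFilePath({ node, getNode, basePath: 'content' })
    createNodeField({ node, name: 'slug', value: `/${pathPrefix}${slug}` })
    createNodeField({ node, name: 'sourceInstanceName', value: pathPrefix })
  }
}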

Go structs to OpenAPI to JSONSchema generation automatically

I have a Go struct for which I want to generate an OpenAPI schema automatically. Once I have an OpenAPI definition of that struct, I want to generate a JSONSchema from it, so that I can validate the input data that will be parsed into those structs.
The struct looks like the following:
// mySpec: io.myapp.MinimalPod
type MinimalPod struct {
    Name string `json:"name"`

    // k8s: io.k8s.kubernetes.pkg.api.v1.PodSpec
    v1.PodSpec
}
The above struct is clearly an augmentation of the Kubernetes PodSpec.
The approach I have used is to generate the definition part for my struct MinimalPod; the definition for PodSpec comes from the upstream OpenAPI spec of Kubernetes. PodSpec has the key io.k8s.kubernetes.pkg.api.v1.PodSpec in the upstream OpenAPI spec, and its definition is injected from there into my Properties. In my code that parses the above struct, I have templates for what to do if a struct field is a string.
If a field has a comment that starts with k8s: ..., the part that follows is the Kubernetes object's OpenAPI definition key. In our case the OpenAPI definition key is io.k8s.kubernetes.pkg.api.v1.PodSpec, so I retrieve that field's definition from the upstream OpenAPI definition and embed it into the definition of my struct.
Once I have generated an OpenAPI definition for this struct, it is injected into the Kubernetes OpenAPI schema's definitions with the key io.myapp.MinimalPod. Now I can use the tool openapi2jsonschema to generate JSONSchema out of it, which produces a JSONSchema file named MinimalPod.json.
Now the jsonschema tool and the file MinimalPod.json can be used to validate input given to my tool's parser, to see if all fields were given correctly.
Is this the right approach, or is there a tool/library that, if I feed Go structs to it, gives me an OpenAPI schema? It would be fine if it cannot identify where to inject the Kubernetes OpenAPI schema from; even automatic parsing of Go structs into an OpenAPI definition would be much appreciated.
Update 1
After following @mehdy's instructions, this is what I have tried:
I have used the import path github.com/kedgeproject/kedge/vendor/k8s.io/client-go/pkg/api/v1 to import the PodSpec definition instead of k8s.io/api/core/v1, and the code looks like this:
package foomodel
import "github.com/kedgeproject/kedge/vendor/k8s.io/client-go/pkg/api/v1"
// MinimalPod is a minimal pod.
// +k8s:openapi-gen=true
type MinimalPod struct {
    Name string `json:"name"`
    v1.PodSpec
}
Now I generate the same thing with the -i flag changed from k8s.io/api/core/v1 to github.com/kedgeproject/kedge/vendor/k8s.io/client-go/pkg/api/v1:
$ go run example/openapi-gen/main.go -i k8s.io/kube-openapi/example/model,github.com/kedgeproject/kedge/vendor/k8s.io/client-go/pkg/api/v1 -h example/foomodel/header.txt -p k8s.io/kube-openapi/example/foomodel
This is what is generated:
$ cat openapi_generated.go
// +build !ignore_autogenerated
/*
======
Some random text
======
*/
// This file was autogenerated by openapi-gen. Do not edit it manually!
package foomodel
import (
    spec "github.com/go-openapi/spec"
    common "k8s.io/kube-openapi/pkg/common"
)

func GetOpenAPIDefinitions(ref common.ReferenceCallback) map[string]common.OpenAPIDefinition {
    return map[string]common.OpenAPIDefinition{
        "k8s.io/kube-openapi/example/model.Container": {
            Schema: spec.Schema{
                SchemaProps: spec.SchemaProps{
                    Description: "Container defines a single application container that you want to run within a pod.",
                    Properties: map[string]spec.Schema{
                        "health": {
                            SchemaProps: spec.SchemaProps{
                                Description: "One common definitions for 'livenessProbe' and 'readinessProbe' this allows to have only one place to define both probes (if they are the same) Periodic probe of container liveness and readiness. Container will be restarted if the probe fails. Cannot be updated. More info: https://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle#container-probes",
                                Ref: ref("k8s.io/client-go/pkg/api/v1.Probe"),
                            },
                        },
                        "Container": {
                            SchemaProps: spec.SchemaProps{
                                Ref: ref("k8s.io/client-go/pkg/api/v1.Container"),
                            },
                        },
                    },
                    Required: []string{"Container"},
                },
            },
            Dependencies: []string{
                "k8s.io/client-go/pkg/api/v1.Container", "k8s.io/client-go/pkg/api/v1.Probe"},
        },
    }
}
This is all that gets generated, while when I switch back to "k8s.io/api/core/v1" the auto-generated config code is more than 8k lines. What am I missing here?
Here the definitions of k8s.io/client-go/pkg/api/v1.Container and k8s.io/client-go/pkg/api/v1.Probe are missing, while when I use k8s.io/api/core/v1 as the import everything is generated.
Note: To reproduce the above steps, please git clone https://github.com/kedgeproject/kedge into your GOPATH.
You can use the kube-openapi package for this. I am going to add a sample to the repo, but I've tested this simple model:
// Car is a simple car model.
// +k8s:openapi-gen=true
type Car struct {
    Color    string
    Capacity int
    // +k8s:openapi-gen=false
    HiddenFeature string
}
Assuming you created this file in the example/model package, run:
go run example/openapi-gen/main.go -h example/model/header.txt -i k8s.io/kube-openapi/example/model -p k8s.io/kube-openapi/example/model
(You also need to add a header.txt file.) You should see a new file created in the example/model folder called openapi_generated.go. This is an intermediate generated file that has your OpenAPI model in it:
func GetOpenAPIDefinitions(ref common.ReferenceCallback) map[string]common.OpenAPIDefinition {
    return map[string]common.OpenAPIDefinition{
        "k8s.io/kube-openapi/example/model.Car": {
            Schema: spec.Schema{
                SchemaProps: spec.SchemaProps{
                    Description: "Car is a simple car model.",
                    Properties: map[string]spec.Schema{
                        "Color": {
                            SchemaProps: spec.SchemaProps{
                                Type:   []string{"string"},
                                Format: "",
                            },
                        },
                        "Capacity": {
                            SchemaProps: spec.SchemaProps{
                                Type:   []string{"integer"},
                                Format: "int32",
                            },
                        },
                    },
                    Required: []string{"Color", "Capacity"},
                },
            },
            Dependencies: []string{},
        },
    }
}
From there you should be able to call the generated method, get the model for your Type and get its Schema.
With some go get magic and a small change to the command line, I was able to generate the definition for your MinimalPod type. Here is what you should change in your code:
package model

import "k8s.io/api/core/v1"

// MinimalPod is a minimal pod.
// +k8s:openapi-gen=true
type MinimalPod struct {
    Name string `json:"name"`
    v1.PodSpec
}
and then change the run command a little to include PodSpec in the generation:
go run example/openapi-gen/main.go -h example/model/header.txt -i k8s.io/kube-openapi/example/model,k8s.io/api/core/v1 -p k8s.io/kube-openapi/example/model
Here is what I got: https://gist.github.com/mbohlool/e399ac2458d12e48cc13081289efc55a
