Terraform custom provider - resource support error - Go

I created a custom provider that calls an internal API used to generate approved resource names according to our standard. I wrote it in Go, but when I test it I get an error saying:
Error: meaningful_resource_name.server_name: Provider doesn't support resource: meaningful_resource_name
Here's the test tf file I'm using:
resource "meaningful_resource_name" "server_name" {
<... parameters ...>
}
output "server_name" {
value = "${meaningful_resource_name.server_name.name}"
}
And this is the provider.go file:
package meaningful

import (
    "github.com/hashicorp/terraform/helper/schema"
    "github.com/hashicorp/terraform/terraform"
)

func Provider() terraform.ResourceProvider {
    return &schema.Provider{
        Schema: map[string]*schema.Schema{},
        ResourcesMap: map[string]*schema.Resource{
            "meaningful_resource_name": meaningfulName(),
        },
        DataSourcesMap: map[string]*schema.Resource{},
    }
}
I have already created another provider using this same structure, so I don't understand why Terraform isn't recognizing this one. Any clues?
I'm using Go v1.10.4 and Terraform v0.11.10
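For reference, a Terraform 0.11 provider also needs a main package that serves the plugin, and the compiled binary has to be named terraform-provider-<name> (here presumably terraform-provider-meaningful, since Terraform maps the part of the resource type name before the first underscore to the provider name). A minimal sketch, where the import path for the provider package is an assumption:

package main

import (
    "github.com/hashicorp/terraform/plugin"
    "github.com/hashicorp/terraform/terraform"

    // Assumed import path for the provider package shown above.
    meaningful "example.com/terraform-provider-meaningful/meaningful"
)

func main() {
    // plugin.Serve hands the provider to Terraform over the plugin
    // protocol; without this entry point (and the right binary name),
    // Terraform cannot discover the provider's resource map at all.
    plugin.Serve(&plugin.ServeOpts{
        ProviderFunc: func() terraform.ResourceProvider {
            return meaningful.Provider()
        },
    })
}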

Related

How to watch for external custom resource changes in kubebuilder without importing the external type

Suppose I have the below code snippet, which sets up a reconciler that watches the external resource "External":
// SetupWithManager sets up the controller with the Manager.
func (r *SomethingReconciler) SetupWithManager(mgr ctrl.Manager) error {
    return ctrl.NewControllerManagedBy(mgr).
        For(&api.Something{}).
        WithOptions(controller.Options{
            MaxConcurrentReconciles: stdruntime.NumCPU(),
            RecoverPanic:            true,
        }).
        Watches(
            &source.Kind{Type: &somev1.External{}},
            handler.EnqueueRequestsFromMapFunc(r.findInternalObjectsForExternal),
            builder.WithPredicates(predicate.Funcs{
                UpdateFunc: func(ue event.UpdateEvent) bool { return true },
                DeleteFunc: func(de event.DeleteEvent) bool { return true },
            }),
        ).
        Complete(r)
}
My problem is that I cannot import the somev1.External type into my project, because importing the Go module that contains it would break my current project's dependencies.
Is there a way in kubebuilder to watch for external resources without having to explicitly import their types, e.g. by GVK or something?
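For what it's worth, the "GVK or something" idea can be sketched with controller-runtime's unstructured support: an unstructured.Unstructured carrying the external type's GroupVersionKind can be watched without importing the Go type. This is only a sketch under assumed group/version values, reusing api.Something and r.findInternalObjectsForExternal from the snippet above (predicates and options omitted for brevity):

package controllers

import (
    "k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
    "k8s.io/apimachinery/pkg/runtime/schema"
    ctrl "sigs.k8s.io/controller-runtime"
    "sigs.k8s.io/controller-runtime/pkg/handler"
    "sigs.k8s.io/controller-runtime/pkg/source"
)

func (r *SomethingReconciler) SetupWithManager(mgr ctrl.Manager) error {
    // Describe External only by its GVK; no import of somev1 is needed.
    external := &unstructured.Unstructured{}
    external.SetGroupVersionKind(schema.GroupVersionKind{
        Group:   "some.example.com", // placeholder for External's real API group
        Version: "v1",
        Kind:    "External",
    })

    return ctrl.NewControllerManagedBy(mgr).
        For(&api.Something{}).
        Watches(
            &source.Kind{Type: external},
            handler.EnqueueRequestsFromMapFunc(r.findInternalObjectsForExternal),
        ).
        Complete(r)
}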

How to use Gatsby code snippet with GraphQL

I'd like to use a GraphQL code snippet in an '.mdx' file:
---
title: Releasing A GitHub Action
date: "2021-03-22T12:35:16"
slug: /blog/releasing-a-github-action
description: "After using other people's GitHub Actions, I thought I'd give one a shot."
---
this is text in the .mdx file
```graphql
mutation UpdateAllEnvironmentVariablesForSite(
  $id: UUID!
  $buildEnvironmentVariables: [TagInput!]!
  $previewEnvironmentVariables: [TagInput!]!
) {
  updateBuildEnvironmentVariablesForSite: updateEnvironmentVariablesForSite(
    id: $id
    environmentVariables: $buildEnvironmentVariables
    runnerType: BUILDS
  ) {
    success
    message
  }
  updatePreviewEnvironmentVariablesForSite: updateEnvironmentVariablesForSite(
    id: $id
    environmentVariables: $previewEnvironmentVariables
    runnerType: PREVIEW
  ) {
    success
    message
  }
}
```
Continuing to write .mdx
When I develop this, it looks fine. When I build it in Gatsby Cloud, I get the error:
Encountered unknown language 'graphql'. If 'graphql' is an alias for a supported language, use the 'languageAliases' plugin option to map it to the canonical language name.
How do I get around this?
As you can see from the MDX+Gatsby docs:
Note: For now, this only works if the .mdx file exporting the query is placed in src/pages. Exporting GraphQL queries from .mdx files that are used for programmatic page creation in gatsby-node.js via actions.createPage is not currently supported.
A sample working query in a .mdx file looks like:
import { graphql } from "gatsby"

# My Awesome Page

Here's a paragraph, followed by a paragraph with data!

<p>{props.data.site.siteMetadata.description}</p>

export const pageQuery = graphql`
  query {
    site {
      siteMetadata {
        description
        title
      }
    }
  }
`
In your case, I suspect your snippet would fail in a gatsby build even outside the .mdx file, because you are not exporting the query or using the page-query approach. Try adapting your GraphQL query to the "common" pattern above to see how it behaves.

GraphQL Nexus Schema (nexusjs) doesn't compile with scalar types

I am trying to follow the documentation on the Nexus-Schema (nexusjs) website for adding scalar types to my GraphQL application.
I have tried adding many of the different implementations to my src/types/Types.ts file using the samples provided in the documentation and the interactive examples. My attempts include:
Without a third-party library:

import { scalarType } from '@nexus/schema'
import { Kind } from 'graphql'

const DateScalar = scalarType({
  name: 'Date',
  asNexusMethod: 'date',
  description: 'Date custom scalar type',
  parseValue(value) {
    return new Date(value)
  },
  serialize(value) {
    return value.getTime()
  },
  parseLiteral(ast) {
    if (ast.kind === Kind.INT) {
      return new Date(ast.value)
    }
    return null
  },
})
With the graphql-iso-date third-party library:

import { GraphQLDate } from 'graphql-iso-date'

export const DateTime = GraphQLDate
With the graphql-scalars third-party library (as shown in the ghost example):

import { decorateType } from '@nexus/schema'
import { GraphQLDate } from 'graphql-scalars'

export const GQLDate = decorateType(GraphQLDate, {
  rootTyping: 'Date',
  asNexusMethod: 'date',
})
I am using this new scalar type in an object definition like the following:
import { objectType } from '@nexus/schema'

const SomeObject = objectType({
  name: 'SomeObject',
  definition(t) {
    t.date('createdAt') // t.date() is supposed to be available because of `asNexusMethod`
  },
})
In all cases, these types are exported from the types file and imported into the makeSchema's types property.
import * as types from './types/Types'
console.log("Found types", types)
export const apollo = new ApolloServer({
schema: makeSchema({
types,
...
context:()=>(
...
})
})
The console.log statement above does show that the consts declared in the types file are in scope:
Found types {
  GQLDate: Date,
  ...
}
If I run the app in development mode, everything boots up and runs fine.
ts-node-dev --transpile-only ./src/app.ts
However, I encounter errors whenever I try to compile the app to deploy to a server:
ts-node ./src/app.ts && tsc
Note: this error occurs when running just ts-node ./src/app.ts, before it gets to tsc.
The errors shown during the build process are the following:
/Users/user/checkouts/project/node_modules/ts-node/src/index.ts:500
    return new TSError(diagnosticText, diagnosticCodes)
           ^
TSError: ⨯ Unable to compile TypeScript:
src/types/SomeObject.ts:11:7 - error TS2339: Property 'date' does not exist on type 'ObjectDefinitionBlock<"SomeObject">'.

11       t.date('createdAt')
Does anyone have any ideas on either:
a) How can I work around this error? While long-term solutions are ideal, temporary solutions would also be appreciated.
b) Any steps I could follow to debug this error? Or ideas on how to get additional information to assist with debugging?
Any assistance would be very much welcomed. Thanks!
The issue seems to be resolved when the --transpile-only flag is added to the nexus:reflect command.
This means the reflection command gets updated to:
ts-node --transpile-only ./src/app.ts
and the build command gets updated to:
env-cmd -f ./config/.env ts-node --transpile-only ./src/app.ts --nexusTypegen && tsc
A GitHub issue has also been created; it can be reviewed here: https://github.com/graphql-nexus/schema/issues/690

What is the value of $repositoryLocationName when running ExecutePipeline in Dagster's GraphQL API?

I am attempting to launch a Dagster pipeline run with the GraphQL API. I have Dagit running locally and a working pipeline that I can trigger via the playground.
However, I am now trying to trigger the pipeline via GraphQL Playground, available at /graphql.
I am using the following mutation:
mutation ExecutePipeline(
  $repositoryLocationName: String!
  $repositoryName: String!
  $pipelineName: String!
  $runConfigData: RunConfigData!
  $mode: String!
)
...and hence am providing the following query variables:
{
  "repositoryName": "my_repo",
  "repositoryLocationName": <???>,
  "pipelineName": "my_pipeline",
  "mode": "dev",
  "runConfigData": {<MY_RUN_CONFIG>}
}
I am not sure what value repositoryLocationName should take. I have tried a few, but receive the following error:
{
  "data": {
    "launchPipelineExecution": {
      "__typename": "PipelineNotFoundError"
    }
  }
}
This is the tutorial I am following.
Short answer:
Each repository lives inside a repository location. Dagster provides a default repository location name if you do not provide one yourself. To find the location name, click the repository picker in Dagit; the location name appears next to the repository name.
In that example, the repository name is toys_repository and the location name is dagster_test.toys.repo.
Longer answer:
A workspace (defined with your workspace.yaml) is a collection of repository locations.
There are currently three types of repository locations:
Python file
Python module
gRPC server
Each repository location can have multiple repositories. Once you define the location, Dagster is able to automatically go find all the repositories in that location. In the example above, I defined my workspace to have a single Python module repository location:
load_from:
  - python_module: dagster_test.toys.repo
Note that I simply specified a module and did not specify a repository location name, so Dagster assigned a default repository location name.
If I wanted to specify a location name, I would do:
load_from:
  - python_module:
      module_name: dagster_test.toys.repo
      location_name: "my_custom_location_name"
Similarly for a python file location:
load_from:
  - python_file: repo.py
Or with a custom repository location name:
load_from:
  - python_file:
      relative_path: repo.py
      location_name: "my_custom_location_name"
You can also find out using a GraphQL query. Starting from the example provided in the documentation, you just need to add
repositoryOrigin {
  repositoryLocationName
}
resulting in
query PaginatedPipelineRuns {
  pipelineRunsOrError {
    __typename
    ... on PipelineRuns {
      results {
        runId
        pipelineName
        status
        runConfigYaml
        repositoryOrigin {
          repositoryLocationName
        }
        stats {
          ... on PipelineRunStatsSnapshot {
            startTime
            endTime
            stepsFailed
          }
        }
      }
    }
  }
}
This will return the repository location name for any run that is returned. Trigger the pipeline you want the location name for in the UI before querying, and that run will be your first result.

Go structs to OpenAPI to JSONSchema generation automatically

I have a Go struct for which I want to generate an OpenAPI schema automatically. Once I have an OpenAPI definition of that struct, I want to generate a JSON Schema from it, so that I can validate the input data before it is parsed into those structs.
The struct looks like the following:
// mySpec: io.myapp.MinimalPod
type MinimalPod struct {
    Name string `json:"name"`

    // k8s: io.k8s.kubernetes.pkg.api.v1.PodSpec
    v1.PodSpec
}
The struct above is clearly an augmentation of the Kubernetes PodSpec.
The approach I have taken is to generate the definition part for my struct MinimalPod myself; the definition for PodSpec comes from the upstream Kubernetes OpenAPI spec, where it lives under the key io.k8s.kubernetes.pkg.api.v1.PodSpec, and I inject it into my struct's Properties. The code that parses the struct above has templates describing what to do when a struct field is a string.
If a field has a comment starting with k8s:, the part that follows is the OpenAPI definition key of a Kubernetes object, in our case io.k8s.kubernetes.pkg.api.v1.PodSpec. I retrieve that field's definition from the upstream OpenAPI spec and embed it into the definition of my struct.
Once I have generated an OpenAPI definition for this struct, it is injected into the Kubernetes OpenAPI schema's definitions under the key io.myapp.MinimalPod. I can then use the tool openapi2jsonschema to generate a JSON Schema from it, which produces a file named MinimalPod.json.
The jsonschema tool and the MinimalPod.json file can then be used to validate input given to my parser, to check that all fields were given correctly.
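For illustration, the same check could also run in-process in Go with a JSON Schema library; a minimal sketch, assuming the gojsonschema library and a local path for MinimalPod.json (both my choices for the example, not part of the original workflow):

package main

import (
    "fmt"
    "log"

    "github.com/xeipuuv/gojsonschema"
)

func main() {
    // Load the schema generated by openapi2jsonschema and a sample input.
    schema := gojsonschema.NewReferenceLoader("file:///tmp/MinimalPod.json")
    input := gojsonschema.NewStringLoader(`{"name": "my-pod"}`)

    result, err := gojsonschema.Validate(schema, input)
    if err != nil {
        log.Fatal(err)
    }
    if result.Valid() {
        fmt.Println("input matches MinimalPod schema")
        return
    }
    for _, e := range result.Errors() {
        fmt.Println("validation error:", e)
    }
}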
Is this the right approach, or is there a tool or library that I can feed Go structs to and have it give me an OpenAPI schema? It would be fine if it cannot identify where to inject the Kubernetes OpenAPI schema from; even automatic parsing of Go structs into an OpenAPI definition would be much appreciated.
Update 1
After following @mehdy's instructions, this is what I have tried:
I used the import path github.com/kedgeproject/kedge/vendor/k8s.io/client-go/pkg/api/v1 for the PodSpec definition instead of k8s.io/api/core/v1, and the code looks like this:
package foomodel

import "github.com/kedgeproject/kedge/vendor/k8s.io/client-go/pkg/api/v1"

// MinimalPod is a minimal pod.
// +k8s:openapi-gen=true
type MinimalPod struct {
    Name string `json:"name"`

    v1.PodSpec
}
Now I generate the same output, with the -i flag changed from k8s.io/api/core/v1 to github.com/kedgeproject/kedge/vendor/k8s.io/client-go/pkg/api/v1:
$ go run example/openapi-gen/main.go -i k8s.io/kube-openapi/example/model,github.com/kedgeproject/kedge/vendor/k8s.io/client-go/pkg/api/v1 -h example/foomodel/header.txt -p k8s.io/kube-openapi/example/foomodel
This is what is generated:
$ cat openapi_generated.go
// +build !ignore_autogenerated

/*
======
Some random text
======
*/

// This file was autogenerated by openapi-gen. Do not edit it manually!

package foomodel

import (
    spec "github.com/go-openapi/spec"
    common "k8s.io/kube-openapi/pkg/common"
)

func GetOpenAPIDefinitions(ref common.ReferenceCallback) map[string]common.OpenAPIDefinition {
    return map[string]common.OpenAPIDefinition{
        "k8s.io/kube-openapi/example/model.Container": {
            Schema: spec.Schema{
                SchemaProps: spec.SchemaProps{
                    Description: "Container defines a single application container that you want to run within a pod.",
                    Properties: map[string]spec.Schema{
                        "health": {
                            SchemaProps: spec.SchemaProps{
                                Description: "One common definitions for 'livenessProbe' and 'readinessProbe' this allows to have only one place to define both probes (if they are the same) Periodic probe of container liveness and readiness. Container will be restarted if the probe fails. Cannot be updated. More info: https://kubernetes.io/docs/concepts/workloads/pods/pod-lifecycle#container-probes",
                                Ref:         ref("k8s.io/client-go/pkg/api/v1.Probe"),
                            },
                        },
                        "Container": {
                            SchemaProps: spec.SchemaProps{
                                Ref: ref("k8s.io/client-go/pkg/api/v1.Container"),
                            },
                        },
                    },
                    Required: []string{"Container"},
                },
            },
            Dependencies: []string{
                "k8s.io/client-go/pkg/api/v1.Container", "k8s.io/client-go/pkg/api/v1.Probe"},
        },
    }
}
I get only this much generated, while when I switch back to k8s.io/api/core/v1 I get more than 8k lines of auto-generated config code. What am I missing here?
Here the definitions of k8s.io/client-go/pkg/api/v1.Container and k8s.io/client-go/pkg/api/v1.Probe are missing, while when I use k8s.io/api/core/v1 as the import, everything is generated.
Note: to reproduce the steps above, git clone https://github.com/kedgeproject/kedge into your GOPATH.
You can use the kube-openapi package for this. I am going to add a sample to the repo, but I've tested this simple model:
// Car is a simple car model.
// +k8s:openapi-gen=true
type Car struct {
    Color    string
    Capacity int

    // +k8s:openapi-gen=false
    HiddenFeature string
}
Assuming you created this file in example/model (i.e. the package k8s.io/kube-openapi/example/model), run:
go run example/openapi-gen/main.go -h example/model/header.txt -i k8s.io/kube-openapi/example/model -p k8s.io/kube-openapi/example/model
(you also need to add a header.txt file). You should see a new file called openapi_generated.go created in the example/model folder. This is an intermediate generated file that has your OpenAPI model in it:
func GetOpenAPIDefinitions(ref common.ReferenceCallback) map[string]common.OpenAPIDefinition {
    return map[string]common.OpenAPIDefinition{
        "k8s.io/kube-openapi/example/model.Car": {
            Schema: spec.Schema{
                SchemaProps: spec.SchemaProps{
                    Description: "Car is a simple car model.",
                    Properties: map[string]spec.Schema{
                        "Color": {
                            SchemaProps: spec.SchemaProps{
                                Type:   []string{"string"},
                                Format: "",
                            },
                        },
                        "Capacity": {
                            SchemaProps: spec.SchemaProps{
                                Type:   []string{"integer"},
                                Format: "int32",
                            },
                        },
                    },
                    Required: []string{"Color", "Capacity"},
                },
            },
            Dependencies: []string{},
        },
    }
}
From there you should be able to call the generated method, get the model for your Type and get its Schema.
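For instance, a minimal sketch of that last step (my own illustration; the naive ref callback below just turns each definition path into a JSON reference, which is enough here because the Car model has no dependencies):

package main

import (
    "encoding/json"
    "fmt"

    "github.com/go-openapi/spec"

    model "k8s.io/kube-openapi/example/model"
)

func main() {
    // GetOpenAPIDefinitions takes a common.ReferenceCallback, i.e. a
    // func(path string) spec.Ref, used to resolve cross-type references.
    defs := model.GetOpenAPIDefinitions(func(path string) spec.Ref {
        return spec.MustCreateRef(path)
    })

    // Look up the generated definition for Car and print its schema.
    car := defs["k8s.io/kube-openapi/example/model.Car"]
    out, _ := json.MarshalIndent(car.Schema, "", "  ")
    fmt.Println(string(out))
}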
With some go get magic and changing the command line a little, I was able to generate the model for your type. Here is what you should change in your code:
package model

import "k8s.io/api/core/v1"

// MinimalPod is a minimal pod.
// +k8s:openapi-gen=true
type MinimalPod struct {
    Name string `json:"name"`

    v1.PodSpec
}
and then change the run command a little to include PodSpec in the generation:
go run example/openapi-gen/main.go -h example/model/header.txt -i k8s.io/kube-openapi/example/model,k8s.io/api/core/v1 -p k8s.io/kube-openapi/example/model
Here is what I got: https://gist.github.com/mbohlool/e399ac2458d12e48cc13081289efc55a
