Using Gatsby Background with Multiple Images - graphql

I am trying to use gatsby-background-image with multiple images but I can't get it to work.
I have the following query:
const { backgroundImages } = useStaticQuery(graphql`
  query {
    backgroundImages: allFile(
      filter: { extension: { regex: "/(png)/" }, relativeDirectory: { eq: "slider" } }
    ) {
      edges {
        node {
          base
          childImageSharp {
            gatsbyImageData(width: 10, quality: 10, webpOptions: { quality: 70 })
          }
        }
      }
    }
  }
`)
const image = getImage(backgroundImages.edges[0].node.childImageSharp)
const bg = convertToBgImage(image)
This is how I am using the BackgroundImage component:
<BackgroundImage Tag="section" {...bg} preserveStackingContext className={styles.bgImg}>
{props.children}
</BackgroundImage>
I've tried accessing the gatsbyImageData property of the object but that didn't work either.

You are only getting the first image (node) in:
const image = getImage(backgroundImages.edges[0].node.childImageSharp)
If you use a loop with:
const images = [];
backgroundImages.edges.forEach(backgroundImage =>
  images.push(backgroundImage.node.childImageSharp)
);
Once you've stored your nodes, you just need to render them like:
images.map(image => {
  const bg = convertToBgImage(getImage(image));
  return (
    <BackgroundImage Tag="section" {...bg} preserveStackingContext className={styles.bgImg}>
      {props.children}
    </BackgroundImage>
  );
})
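Putting it together, a minimal sketch of how the full component could look (assuming gatsby-plugin-image's getImage and gbimage-bridge's convertToBgImage, as used in the question; the styles import path is illustrative):

import React from 'react'
import { useStaticQuery, graphql } from 'gatsby'
import { getImage } from 'gatsby-plugin-image'
import BackgroundImage from 'gatsby-background-image'
import { convertToBgImage } from 'gbimage-bridge'
import * as styles from './slider.module.css' // illustrative path

const Slider = (props) => {
  const { backgroundImages } = useStaticQuery(graphql`
    query {
      backgroundImages: allFile(
        filter: { extension: { regex: "/(png)/" }, relativeDirectory: { eq: "slider" } }
      ) {
        edges {
          node {
            base
            childImageSharp {
              gatsbyImageData(width: 10, quality: 10, webpOptions: { quality: 70 })
            }
          }
        }
      }
    }
  `)

  // render one BackgroundImage section per queried file
  return backgroundImages.edges.map(({ node }) => {
    const bg = convertToBgImage(getImage(node.childImageSharp))
    return (
      <BackgroundImage
        key={node.base}
        Tag="section"
        {...bg}
        preserveStackingContext
        className={styles.bgImg}
      >
        {props.children}
      </BackgroundImage>
    )
  })
}

export default Slider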

Related

Gatsby Contentful embedded image

As far as I can see, there is no longer a json option when querying contentfulBlogPost, only raw. I was able to make some changes to get everything from the body, except the image from that post.
If I run a test in GraphQL Playground I can get the image id and url, but that's it.
query {
  allContentfulAsset {
    edges {
      node {
        id
        file {
          url
        }
      }
    }
  }
}
I tried to find an example of how to get embedded images, but no luck.
import React from 'react'
import { graphql } from 'gatsby'
import { documentToReactComponents } from '@contentful/rich-text-react-renderer'
import Layout from '../components/layout'
export const query = graphql`
  query($slug: String!) {
    contentfulBlogPost(slug: { eq: $slug }) {
      title
      publishedDate(formatString: "MMMM Do, YYYY")
      body {
        raw
      }
    }
    allContentfulAsset {
      edges {
        node {
          id
          file {
            url
          }
        }
      }
    }
  }
`
const Blog = (props) => {
  const options = {
    renderNode: {
      "embedded-asset-block": (node) => {
        const alt = node.data.title
        const url = node.file.url
        return <img alt={alt} src={url} />
      }
    }
  }
  return (
    <Layout>
      <h1>{props.data.contentfulBlogPost.title}</h1>
      <p>{props.data.contentfulBlogPost.publishedDate}</p>
      {documentToReactComponents(JSON.parse(props.data.contentfulBlogPost.body.raw, options))}
    </Layout>
  )
}
export default Blog
Plugins:
...
'gatsby-plugin-sharp',
{
  resolve: 'gatsby-transformer-remark',
  options: {
    plugins: [
      'gatsby-remark-relative-images',
      {
        resolve: 'gatsby-remark-images-contentful',
        options: {
          maxWidth: 750,
          linkImagesToOriginal: false
        }
      }
    ]
  }
}
],
}
I saw this solution in a YouTube comment. The first thing you have to do is change your GraphQL query to something like this:
query ($slug: String!) {
  contentfulBlogPost(slug: { eq: $slug }) {
    id
    title
    publishedDate(formatString: "MMMM Do, YYYY")
    body {
      raw
      references {
        ... on ContentfulAsset {
          contentful_id
          title
          file {
            url
          }
        }
      }
    }
  }
}
Then change your options constant to:
const options = {
  renderNode: {
    [BLOCKS.EMBEDDED_ASSET]: node => {
      console.log(node);
      const imageID = node.data.target.sys.id;
      const {
        file: { url },
        title
      } = props.data.contentfulBlogPost.body.references.find(
        ({ contentful_id: id }) => id === imageID
      );
      return <img src={url} alt={title} />
    }
  }
}
Use something like:
import { BLOCKS, MARKS } from "@contentful/rich-text-types"
import { renderRichText } from "gatsby-source-contentful/rich-text"

const Bold = ({ children }) => <span className="bold">{children}</span>
const Text = ({ children }) => <p className="align-center">{children}</p>

const options = {
  renderMark: {
    [MARKS.BOLD]: text => <Bold>{text}</Bold>,
  },
  renderNode: {
    [BLOCKS.PARAGRAPH]: (node, children) => <Text>{children}</Text>,
    [BLOCKS.EMBEDDED_ASSET]: node => {
      return (
        <>
          <h2>Embedded Asset</h2>
          <pre>
            <code>{JSON.stringify(node, null, 2)}</code>
          </pre>
        </>
      )
    },
  },
}

renderRichText(node.bodyRichText, options)
Source: https://www.contentful.com/developers/docs/tutorials/general/rich-text-and-gatsby/
The return statement in the BLOCKS.EMBEDDED_ASSET entry will contain your data; adapt it to your needs. If you look inside the dependency, you'll see all the exposed node types, so you will also have a BLOCKS.EMBEDDED_ENTRY entry for your embedded entries. Apply it like:
[BLOCKS.EMBEDDED_ENTRY]: node => {
  // your logic to manipulate the entry here
  return (
    <>
      <div>whatever</div>
    </>
  )
},
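For reference, here is a minimal sketch of how the asker's Blog component could look once the query includes references (the page query with $slug stays as in the answer above; the Layout import comes from the original question, the null fallback is a defensive addition, and note that options is passed to documentToReactComponents, not to JSON.parse):

import React from 'react'
import { BLOCKS } from '@contentful/rich-text-types'
import { documentToReactComponents } from '@contentful/rich-text-react-renderer'
import Layout from '../components/layout'

const Blog = (props) => {
  const { title, publishedDate, body } = props.data.contentfulBlogPost

  const options = {
    renderNode: {
      [BLOCKS.EMBEDDED_ASSET]: node => {
        // match the embedded asset against the queried references by id
        const asset = body.references.find(
          ({ contentful_id: id }) => id === node.data.target.sys.id
        )
        return asset ? <img src={asset.file.url} alt={asset.title} /> : null
      },
    },
  }

  return (
    <Layout>
      <h1>{title}</h1>
      <p>{publishedDate}</p>
      {documentToReactComponents(JSON.parse(body.raw), options)}
    </Layout>
  )
}

export default Blog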
For anyone still struggling to find the references field in GraphQL: remember that you HAVE TO first create an entry in Contentful that embeds at least one image. Otherwise the references field will not show up in GraphQL, and you cannot query it.

Showing a list of images with Gatsby Image and Graphql

I have been working with Gatsby and GraphQL on a project for a bit now. This is my first time using this platform, so I'm still figuring out the right way to go about it.
Right now, I have an index.js which loads a list of users from a GraphQL query such as:
allUserList {
  users {
    name
    imageUrl
  }
}
imageUrl is expected to contain the correct file path, i.e. images/image.png.
This is passed directly to a React component which is used to display the user info:
const Users = ({ data }) => {
  return data.users.map(user => (
    <User name={user.name} imageUrl={user.imageUrl} />
  ))
}
What I am trying to figure out now is this: imageUrl contains the relative url to the image in the codebase, so I would need to run an image query on this url to get the image object, which I can then pass to gatsby-image. Something like this:
{
  image: file(relativePath: { eq: "image.png" }) {
    childImageSharp {
      fixed(width: 160, height: 160) {
        ...GatsbyImageSharpFixed
      }
    }
  }
}
However, as I understand it, it isn't possible to parameterize the image url here. What is the best way to go about this? Is this even the correct approach to show a list with images in this manner?
However, as I understand it, it isn't possible to parameterize the image url here.
Correct. Gatsby runs GraphQL queries ONCE at build time and then never again until you gatsby develop or gatsby build again.
Two approaches:
First approach: name your images after user.name. You can then query all your images and filter for each user by file name using the GraphQL originalName attribute:
const UserSupplier = ({ user }) => {
  const { allFile } = useStaticQuery(graphql`
    query {
      allFile(
        filter: {
          extension: { regex: "/(jpg)|(jpeg)|(png)/" }
          sourceInstanceName: { eq: "user" }
        }
      ) {
        edges {
          node {
            childImageSharp {
              fluid(maxWidth: 150, quality: 100) {
                originalName
                ...GatsbyImageSharpFluid
              }
            }
          }
        }
      }
    }
  `);

  // find the image whose file name matches this user's name
  const match = allFile.edges.find(
    ({ node }) => node.childImageSharp.fluid.originalName.startsWith(user.name)
  );
  const { fluid } = match.node.childImageSharp;

  return (
    <UserComponent>
      <GatsbyImage fluid={fluid} alt={fluid.originalName} />
    </UserComponent>
  );
};
Second approach: query for your childImageSharp image object in your allUserList query. You can then pass the image as props to the component that renders it. That would make your query result quite data intensive, depending on how many users there are.
allUserList {
  users {
    name
    imageUrl
    childImageSharp { # the query path is probably wrong since I don't know your project
      fixed(width: 160, height: 160) {
        ...GatsbyImageSharpFixed
      }
    }
  }
}
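With the second approach, a minimal sketch of how the User component could consume the image passed down as a prop (names are illustrative; assumes gatsby-image and the fixed fragment queried above):

import React from 'react'
import Img from 'gatsby-image'

// `image` is the childImageSharp object queried alongside each user
const User = ({ name, image }) => (
  <div>
    <Img fixed={image.fixed} alt={name} />
    <p>{name}</p>
  </div>
)

export default User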

How to get Gatsby Images based on results of PageQuery?

I'd like to do something like the following so I can get Gatsby Images dynamically:
const image = 'gatsby-astronaut.png';
export const imageQuery = graphql`
  {
    allImageSharp(
      filter: {
        fluid: {
          originalName: {
            regex: "/${image}/"
          }
        }
      }
    ) {
      edges {
        node {
          fluid {
            originalName
          }
        }
      }
    }
  }
`;
However, I can't figure out how to connect this query to an initial query that would get 'gatsby-astronaut.png', or how to perform this query from a subcomponent. I get this error when I try this:
Error: BabelPluginRemoveGraphQL: String interpolations are not allowed
in graphql fragments. Included fragments should be referenced as
`...MyModule_foo`.
Any suggestions on the proper way to return Gatsby Images dynamically?
Ah yeah, Gatsby extracts GraphQL queries from your pages through static analysis: it loads the file as text, parses it, and extracts the query, all before the actual file gets executed. This means that your typical tagged-template-literal functionality isn't there.
The only way to filter is through context provided when createPage is called from gatsby-node.js. I.e. if you do this:
// in gatsby-node.js
const path = require('path')

exports.createPages = ({ graphql, actions }) =>
  graphql(`some query here`).then(result => {
    actions.createPage({
      path: "/output-path/",
      component: path.resolve(`./src/templates/your_template.jsx`),
      context: { image: result.data.yourImage },
    })
  })
Then you can do this in your page query:
query SomePage($image: String!) {
  allImageSharp(
    filter: {
      fluid: {
        originalName: {
          regex: $image
        }
      }
    }
  ) {
    edges {
      node {
        fluid {
          originalName
        }
      }
    }
  }
}
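For completeness, a minimal sketch of the template referenced in the createPage call (src/templates/your_template.jsx) that could consume this page query. The ...GatsbyImageSharpFluid fragment is added here so the result can actually be rendered with gatsby-image, and the rendering part is illustrative:

import React from 'react'
import { graphql } from 'gatsby'
import Img from 'gatsby-image'

const YourTemplate = ({ data }) => {
  // the query below receives $image from the page context set in gatsby-node.js
  const { node } = data.allImageSharp.edges[0]
  return <Img fluid={node.fluid} alt={node.fluid.originalName} />
}

export const query = graphql`
  query SomePage($image: String!) {
    allImageSharp(filter: { fluid: { originalName: { regex: $image } } }) {
      edges {
        node {
          fluid {
            originalName
            ...GatsbyImageSharpFluid
          }
        }
      }
    }
  }
`

export default YourTemplate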
Here's a solution I came up with... pretty janky, but it works:
import PropTypes from 'prop-types';
import React from 'react';
import Img from 'gatsby-image';
import { useStaticQuery, graphql } from 'gatsby';

const Image = ({ imageYouWant }) => {
  const data = useStaticQuery(
    graphql`
      query allTheImagesQuery {
        allImageSharp {
          edges {
            node {
              fluid(maxWidth: 1000) {
                ...GatsbyImageSharpFluid
                originalName
              }
            }
          }
        }
      }
    `,
  );

  const TheImageYouWant = data.allImageSharp.edges
    .filter(edge => edge.node.fluid.originalName === imageYouWant)
    .map(myImage => <Img fluid={myImage.node.fluid} />);

  return (
    <>
      {TheImageYouWant}
    </>
  );
};

Image.propTypes = {
  imageYouWant: PropTypes.string,
};

Image.defaultProps = {
  imageYouWant: '',
};

export default Image;
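Usage would then look something like this (the file name is illustrative):

<Image imageYouWant="gatsby-astronaut.png" />

Note that because this statically queries every image, the page's GraphQL data grows with the number of images in the project.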

Add ImageSharp as a field to MarkdownRemark nodes (not frontmatter)

I have the following GraphQL query I'm trying to get working:
{
  allMarkdownRemark(limit: 1000) {
    edges {
      node {
        id
        parent {
          id
        }
        fields {
          slug
          hero {
            childImageSharp {
              fixed {
                src
              }
            }
          }
        }
        frontmatter {
          template
        }
      }
    }
  }
}
The hero field currently returns a path to an image using the following code:
// in gatsby-node.js
const fs = require('fs')
const { createFilePath } = require('gatsby-source-filesystem')

exports.onCreateNode = ({ node, actions, getNode }) => {
  const { createNodeField } = actions
  // Add slug to MarkdownRemark node
  if (node.internal.type === 'MarkdownRemark') {
    const value = createFilePath({ node, getNode, basePath: 'library' })
    const { dir } = getNode(node.parent)
    const getHero = (d) => {
      let hero = `${__dirname}/src/images/no-hero.gif`
      if (fs.existsSync(`${d}/hero.jpg`)) hero = `${d}/hero.jpg`
      if (fs.existsSync(`${d}/hero.png`)) hero = `${d}/hero.png`
      if (fs.existsSync(`${d}/hero.gif`)) hero = `${d}/hero.gif`
      return hero
    }
    createNodeField({
      node,
      name: 'slug',
      value,
    })
    createNodeField({
      node,
      name: 'hero',
      value: getHero(dir),
    })
  }
}
I've seen other people do something similar with an image path in the frontmatter, but I don't want to have to use frontmatter when it's easy enough for GraphQL to see the file path without having to specify it.
However when I try the above I get the following error:
Field "hero" must not have a selection since type "String" has no subfields.
Is there a way I can get childImageSharp to recognize this field?
I'm back again to (hopefully) settle this issue once and for all (see our history here).
This time, we'll attach the hero image's ImageSharp to the MarkdownRemark node. Your approach is correct, with one caveat: Gatsby seems to only recognize relative paths, i.e. paths starting with a dot.
You can fix this easily in your code:
const getHero = (d) => {
  let hero = `${__dirname}/src/images/no-hero.gif`
- if (fs.existsSync(`${d}/hero.jpg`)) hero = `${d}/hero.jpg`
- if (fs.existsSync(`${d}/hero.png`)) hero = `${d}/hero.png`
- if (fs.existsSync(`${d}/hero.gif`)) hero = `${d}/hero.gif`
+ if (fs.existsSync(`${d}/hero.jpg`)) hero = `./hero.jpg`
+ if (fs.existsSync(`${d}/hero.png`)) hero = `./hero.png`
+ if (fs.existsSync(`${d}/hero.gif`)) hero = `./hero.gif`
  return hero
}
createNodeField({
  node,
  name: 'hero',
  value: getHero(dir),
})
This should work, though I want to provide an alternative hero search function. We can get a list of files in dir with fs.readdir, then find a file with the name 'hero':
const fs = require('fs')
const path = require('path')

exports.onCreateNode = async ({ node, actions }) => {
  const { createNodeField } = actions
  if (node.internal.type === 'MarkdownRemark') {
    const { dir } = path.parse(node.fileAbsolutePath)
    const heroImage = await new Promise((res, rej) => {
      // get a list of files in `dir`
      fs.readdir(dir, (err, files) => {
        if (err) return rej(err)
        // if there's a file whose name includes `hero`, return it
        res(files.find(file => file.includes('hero')))
      })
    })
    // path.relative will return (surprise!) a relative path from arg 1 to arg 2.
    // You can use this to set up your default hero.
    const heroPath = heroImage
      ? `./${heroImage}`
      : path.relative(dir, 'src/images/default-hero.jpg')
    // create the field with a relative path
    createNodeField({
      node,
      name: 'hero',
      value: heroPath,
    })
  }
}
This way we don't care what the hero image's extension is, as long as it exists. I use String.prototype.includes, but to be safe you might want to use a regex with a list of allowed extensions, like /hero\.(png|jpg|gif|svg)/. (I think your solution is more readable, but I prefer to access the file system only once per node.)
You can also use path.relative to find the relative path to a default hero image.
Now, the GraphQL query from your question works as expected.
A (Minor) Problem
However, there's a minor problem with this approach: it breaks the GraphQL filter type! When I try to query and filter based on hero, I get an error.
Perhaps Gatsby forgot to re-infer the type of hero, so instead of being a File, it is still a String. This is annoying if you need the filter to work.
Here's a workaround: Instead of asking Gatsby to link the file, we'll do it ourselves.
exports.onCreateNode = async ({ node, actions, getNode, getNodesByType }) => {
  const { createNodeField } = actions
  // Add slug to MarkdownRemark node
  if (node.internal.type === 'MarkdownRemark') {
    const { dir } = path.parse(node.fileAbsolutePath)
    const heroImage = await new Promise((res, rej) => {
      fs.readdir(dir, (err, files) => {
        if (err) return rej(err)
        res(files.find(file => file.includes('hero')))
      })
    })
    // substitute with a default image if there's no hero image
    const heroPath = heroImage
      ? path.join(dir, heroImage)
      : path.resolve(__dirname, 'src/images/default-hero.jpg')
    // get all file nodes
    const fileNodes = getNodesByType('File')
    // find the hero image's node
    const heroNode = fileNodes.find(fileNode => fileNode.absolutePath === heroPath)
    createNodeField({
      node,
      name: 'hero___NODE',
      value: heroNode.id,
    })
  }
}
And now we can filter the hero field again:
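For example, a filter along these lines should now be accepted (an illustrative sketch only; the exact sub-fields available under hero depend on your schema):

{
  allMarkdownRemark(
    filter: { fields: { hero: { extension: { eq: "jpg" } } } }
  ) {
    edges {
      node {
        fields {
          hero {
            childImageSharp {
              fixed {
                src
              }
            }
          }
        }
      }
    }
  }
}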
If you don't need to filter content by hero image, though, letting Gatsby handle the node type is much preferable.
Let me know if you run into issues trying this.

Spot the difference between these two images

Programmatically, my code is detecting a difference between two classes of images, and always rejecting one class, while always allowing the other.
I have yet to find any difference between the images that yield the error and the ones that don't yield an error. But there has to be some difference, because the ones that yield an error do so 100% of the time, and the others work as expected 100% of the time.
In particular, I have inspected color format: RGB in both groups; size: no notable difference; datatype: uint8 in both; magnitude of pixel values: similar in both.
Below are two images that never work, followed by two images that always work:
This image never works: https://www.colourbox.com/preview/11906131-maple-tree-and-grass-silhouette.jpg
This image never works: http://feldmanphoto.com/wp-content/uploads/awe-inspiring-house-clipart-black-and-white-disney-coloring-pages-big-clipartxtras-illistration-background-housewives-bouncy.jpeg
This image always works: http://www.spacedesign.us/wp-content/uploads/landscape-with-old-tree-and-grass-over-white-background-black-and-black-and-white-trees.jpg
This image always works: http://www.modernhouse.co/wp-content/uploads/2017/07/1024px-RoseSeidlerHouseSulmanPrize.jpg
How can I spot the difference?
The scenario is that I am using Firebase with a Swift iOS front end to send these images to a convnet hosted on Google Cloud ML Engine. Some images work all the time and certain others never work, as above. Further, all images work when I use the gcloud versions predict CLI. To me the issue is necessarily something in the images, hence I am posting here for the imaging group. Code is included as requested for completeness.
The code of the index.js file is included below:
'use strict';
const functions = require('firebase-functions');
const gcs = require('@google-cloud/storage');
const admin = require('firebase-admin');
const exec = require('child_process').exec;
const path = require('path');
const fs = require('fs');
const google = require('googleapis');
const sizeOf = require('image-size');

admin.initializeApp(functions.config().firebase);
const db = admin.firestore();
const rtdb = admin.database();
const dbRef = rtdb.ref();
function cmlePredict(b64img) {
  return new Promise((resolve, reject) => {
    google.auth.getApplicationDefault(function (err, authClient) {
      if (err) {
        reject(err);
      }
      if (authClient.createScopedRequired && authClient.createScopedRequired()) {
        authClient = authClient.createScoped([
          'https://www.googleapis.com/auth/cloud-platform'
        ]);
      }
      var ml = google.ml({
        version: 'v1'
      });
      const params = {
        auth: authClient,
        name: 'projects/myproject-18865/models/my_model',
        resource: {
          instances: [
            {
              "image_bytes": {
                "b64": b64img
              }
            }
          ]
        }
      };
      ml.projects.predict(params, (err, result) => {
        if (err) {
          reject(err);
        } else {
          resolve(result);
        }
      });
    });
  });
}
function resizeImg(filepath) {
  return new Promise((resolve, reject) => {
    exec(`convert ${filepath} -resize 224x ${filepath}`, (err) => {
      if (err) {
        console.error('Failed to resize image', err);
        reject(err);
      } else {
        console.log('resized image successfully');
        resolve(filepath);
      }
    });
  });
}
exports.runPrediction = functions.storage.object().onChange((event) => {
  fs.rmdir('./tmp/', (err) => {
    if (err) {
      console.log('error deleting tmp/ dir');
    }
  });
  const object = event.data;
  const fileBucket = object.bucket;
  const filePath = object.name;
  const bucket = gcs().bucket(fileBucket);
  const fileName = path.basename(filePath);
  const file = bucket.file(filePath);
  if (filePath.startsWith('images/')) {
    const destination = '/tmp/' + fileName;
    console.log('got a new image', filePath);
    return file.download({
      destination: destination
    }).then(() => {
      if (sizeOf(destination).width > 224) {
        console.log('scaling image down...');
        return resizeImg(destination);
      } else {
        return destination;
      }
    }).then(() => {
      console.log('base64 encoding image...');
      let bitmap = fs.readFileSync(destination);
      return new Buffer(bitmap).toString('base64');
    }).then((b64string) => {
      console.log('sending image to CMLE...');
      return cmlePredict(b64string);
    }).then((result) => {
      console.log(`results just returned and is: ${result}`);
      let predict_proba = result.predictions[0]
      const res_pred_val = Object.keys(predict_proba).map(k => predict_proba[k])
      const res_val = Object.keys(result).map(k => result[k])
      const class_proba = [1 - res_pred_val, res_pred_val]
      const opera_proba_init = 1 - res_pred_val
      const capitol_proba_init = res_pred_val - 0
      // convert fraction double to percentage int
      let opera_proba = (Math.floor((opera_proba_init.toFixed(2)) * 100)) | 0
      let capitol_proba = (Math.floor((capitol_proba_init.toFixed(2)) * 100)) | 0
      let feature_list = ["houses", "trees"]
      let outlinedImgPath = '';
      let imageRef = db.collection('predicted_images').doc(filePath.slice(7));
      outlinedImgPath = `outlined_img/${filePath.slice(7)}`;
      imageRef.set({
        image_path: outlinedImgPath,
        opera_proba: opera_proba,
        capitol_proba: capitol_proba
      });
      let predRef = dbRef.child("prediction_categories");
      let arrayRef = dbRef.child("prediction_array");
      predRef.set({
        opera_proba: opera_proba,
        capitol_proba: capitol_proba,
      });
      arrayRef.set({
        first: {
          array_proba: [opera_proba, capitol_proba],
          brief_description: ["a", "b"],
          more_details: ["aaaa", "bbbb"],
          feature_list: feature_list
        },
        zummy1: "",
        zummy2: ""
      });
      return bucket.upload(destination, { destination: outlinedImgPath });
    });
  } else {
    return 'not a new image';
  }
});
The issue was that the bad images were effectively grayscale, not RGB as expected by my model. I had initially checked this by looking at the shape, but the 'bad' images had 3 color channels in which each channel stored the same values, so my model was refusing to accept them. Also, as expected (and contrary to what I initially thought I had observed), it turns out the gcloud ML Engine predict CLI actually also failed for these images. It took me 2 days to figure this out!
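If you want to detect this condition programmatically, here is a minimal sketch using the sharp library (an assumption on my part; it is not part of the original Cloud Function). It flags images whose three channels carry identical values, i.e. grayscale images stored as RGB:

const sharp = require('sharp');

// Returns true if every pixel has identical R, G and B values,
// i.e. the image is grayscale even though it may be stored with 3 channels.
async function isEffectivelyGrayscale(filepath) {
  const { data, info } = await sharp(filepath)
    .raw()
    .toBuffer({ resolveWithObject: true });
  if (info.channels < 3) return true; // stored as single-channel grayscale
  for (let i = 0; i < data.length; i += info.channels) {
    if (data[i] !== data[i + 1] || data[i] !== data[i + 2]) return false;
  }
  return true;
}

// example usage with a hypothetical local file
isEffectivelyGrayscale('/tmp/some-image.jpg').then(console.log);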
