Gatsby site deployed on Netlify not updating data from graphcms - graphql

I am a beginner with Gatsby and GraphCMS. Fetching data from the CMS in the Gatsby development environment works fine, everything is good. I have deployed my website on Netlify, but when I add new content via the CMS, the site does not update or fetch it.
The component that needs content from the CMS:
import React from "react"
import { StaticQuery, graphql } from "gatsby"
import ServicesMobileProduct from "./ServicesMobileProduct"

const ProductsMobile = () => (
  <StaticQuery
    query={graphql`
      {
        product {
          products {
            id
            productName
            description
            price
            amount
          }
        }
      }
    `}
    render={({ product: { products } }) => (
      <>
        {products.map(({ productName, description, price, amount, id }) => (
          <ServicesMobileProduct
            key={id}
            productName={productName}
            description={description}
            price={price}
            amount={amount}
          />
        ))}
      </>
    )}
  />
)

export default ProductsMobile

Gatsby is a static site generator: at build/develop time it gathers all the data from the CMS, Markdown, JSON, or other data sources and creates the public HTML output in the /public folder.
Generally, once the site is built, you need to rebuild and redeploy it whenever content is created, updated, or deleted, because the static output is not updated with new CMS changes on its own.
What you are trying to achieve is done with a webhook. A webhook is a way for one application to notify another in real time when an event has occurred, such as the creation, deletion, or modification of content at a source.
In Gatsby, some sources (like DatoCMS) expose a webhook, but this only works in development mode: any CMS change triggers a gatsby develop refresh of the content. Of course, running a live site in gatsby develop mode just to get automated refreshes is strongly discouraged.
In build mode, the idea is quite similar, but instead of re-running gatsby develop, you need to trigger a gatsby build plus a deploy. If you are using a continuous deployment (CD) tool such as Netlify, you can achieve this easily. If you are using a continuous integration (CI) tool such as Jenkins, you need to configure a pipeline for it.
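With Netlify specifically, the usual approach is to create a build hook (Site settings > Build & deploy > Build hooks) and paste its URL into your CMS's webhook settings, so every published change triggers a rebuild. No code is required for that, but as a minimal sketch (the hook ID below is a placeholder), the same trigger can be fired from any script:
// Trigger a Netlify rebuild by POSTing to a build hook.
// Replace <your-hook-id> with the ID Netlify generates for your site.
const BUILD_HOOK = "https://api.netlify.com/build_hooks/<your-hook-id>"

fetch(BUILD_HOOK, { method: "POST" })
  .then(res => console.log("Rebuild triggered, status:", res.status))
  .catch(err => console.error("Failed to trigger rebuild:", err))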
Another way to achieve what you want is to make an asynchronous JavaScript request to an external API or data source that populates your application with content at runtime. This works in any environment, but you lose the SEO potential (and other benefits) that Gatsby brings.
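As a rough sketch of that runtime approach (the endpoint and query are placeholders, not your project's real API URL):
// Fetch content in the browser at runtime instead of at build time.
import React, { useEffect, useState } from "react"

const ProductsRuntime = () => {
  const [products, setProducts] = useState([])

  useEffect(() => {
    // Placeholder endpoint; substitute your GraphCMS content API URL.
    fetch("https://api.graphcms.com/<project-endpoint>", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ query: "{ products { id productName } }" }),
    })
      .then(res => res.json())
      .then(({ data }) => setProducts(data.products))
  }, [])

  return (
    <ul>
      {products.map(p => (
        <li key={p.id}>{p.productName}</li>
      ))}
    </ul>
  )
}

export default ProductsRuntime
Content rendered this way never appears in the static HTML Gatsby emits, which is exactly the SEO trade-off mentioned above.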

Related

Calls From External Web Components in PWAs [duplicate]

We are running two servers. Server 1 hosts a React application. Server 2 hosts a web component exposed as a single JavaScript bundle, along with some assets such as images. We are dynamically loading the web component JavaScript hosted on Server 2 into our React app hosted on Server 1. The fact that it is a web component might or might not affect the issue.
What's happening is that the web component makes use of assets such as images located on Server 2. But when the React app loads the web component, the images are not found, because the browser looks for them locally on Server 1.
We can fix this in many ways. I am looking for the simplest fix. Since the Server 1 and Server 2 apps are being developed by different teams, both should be able to develop in the most natural way possible, without making allowances for their app being potentially loaded by other apps.
The fixes that I could think of were:
Making use of absolute URLs to load assets - requires knowing the deployed location in advance.
Adding a reverse proxy to Server 1 that redirects to Server 2 whenever a particular path is hit - requires configuration, and since the React app could load hundreds of web components, we would need to add a lot of reverse proxies.
Embedding all assets into the single JavaScript bundle on Server 2, e.g. inlining SVGs into the JavaScript - too limiting, and huge SVGs would blow up the bundle size.
I was hoping to implement something like this:
Since the React app knows where to hit Server 2, can't we write some clever JavaScript that makes the browser go to Server 2 whenever assets are requested by JavaScript loaded from Server 2?
If you load your Web Component via a classic script (<script> with the default type="text/javascript"), you can retrieve the URL of the loaded file from document.currentScript.src.
If you load the file as a module script (<script> with type="module"), you can retrieve the URL from import.meta.url.
Then parse that property to extract the base path to the Web Component.
Example of Web Component Javascript file:
( function ( path ) {
  // Derive the base URL of this script file, so relative asset URLs
  // resolve against Server 2 rather than the host page on Server 1.
  var base = path.slice( 0, path.lastIndexOf( '/' ) )
  customElements.define( 'my-comp', class extends HTMLElement {
    constructor() {
      super()
      this.attachShadow( { mode: 'open' } )
        .innerHTML = `<img src="${base}/image.png">`
    }
  } )
// Note: import.meta is only valid syntax inside a module script;
// in a classic script, use document.currentScript.src alone.
} )( document.currentScript ? document.currentScript.src : import.meta.url )
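For instance, a hypothetical host page on Server 1 could load the component like this (the URL is illustrative); the image request inside the shadow root then resolves against Server 2:
<!-- Host page on Server 1; the bundle itself lives on Server 2. -->
<script type="module" src="https://server2.example.com/components/my-comp.js"></script>
<my-comp></my-comp>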
How about uploading all required assets to a third location, such as an AWS S3 bucket, Google Drive, or Dropbox? That way those assets always have a unique, known URL, and both teams can access them independently. As long as the names remain the same, so will the URLs.

NativeScript adding xml namespace on Page tag

I'm new to NativeScript. I have created a new project using the Angular Blank template. Navigation is done using page-router-outlet. I want to include an xmlns attribute at the page level. As far as I can see and understand, the entire code is rendered inside a global page element. I've seen that I can modify the page's properties by injecting Page into a component and changing its properties, but how can I do this for xmlns?
Best regards,
Vlad
To register a UI component in an Angular-based application you should use registerElement, not XML namespaces (which are a concept used in NativeScript Core). Nowadays most plugin authors do this job for you, but some plugins have not been migrated to the latest techniques, so in some cases we have to register the UI element manually.
I've created this test application, which demonstrates how to use nativescript-stripe in Angular. Here are the steps to enable and use the plugin.
Installation
npm i nativescript-stripe --save
Register the UI element in app.module.ts as done here
import { registerElement } from "nativescript-angular/element-registry";
registerElement("CreditCardView", () => require("nativescript-stripe").CreditCardView);
Add the following in main.ts as required in the plugin README
import * as app from "tns-core-modules/application";
import * as platform from "tns-core-modules/platform";

declare const STPPaymentConfiguration;

app.on(app.launchEvent, (args) => {
  if (platform.isIOS) {
    STPPaymentConfiguration.sharedConfiguration().publishableKey = "yourApiKey";
  }
});
Use the plugin in your HTML (example)
<CreditCardView id="card"></CreditCardView>

Failed Prop Type Error in Fine Uploader

I'm trying to get Fine Uploader React to work but keep running into issues.
I'm getting "Failed prop type" errors in the browser console.
Here's the URL: http://fineuploader.azurewebsites.net/
Here's what I've done so far:
Downloaded the source on to my computer from https://github.com/FineUploader/react-fine-uploader
I then npm installed react-fine-uploader and fine-uploader as per instructions
I ran webpack to transpile and bundle the code
Added an entry point and index.html
Finally, I simply published the app to a new Azure app/website
Any idea what's causing the issue?
P.S. My goal is to use Fine Uploader to upload files to Azure Blob Storage. At this point, I'm simply trying to get Fine Uploader going. I do realize that I'll have to enter a few pieces of information about my blob storage endpoint, etc., but I don't think this error is related to any of that.
A Gallery (and every higher-level component of that library) needs an "uploader" prop, as explained in this section: https://github.com/FineUploader/react-fine-uploader#high-level-components
An uploader is one of the 3 classes available in the fine-uploader-wrappers package: https://github.com/FineUploader/fine-uploader-wrappers#wrapper-classes
Those target uploads to:
AWS S3
Azure
or your own endpoint
The uploader class needs all the configuration: endpoint, credentials, custom options, etc. (you can find a comprehensive list in the API section: https://docs.fineuploader.com/branch/master/api/options.html)
An example for s3 direct upload would be something like:
import FineUploaderS3 from "fine-uploader-wrappers/s3"

const uploader = new FineUploaderS3({
  options: {
    request: {
      endpoint: "http://fineuploadertest.s3.amazonaws.com",
      accessKey: "AKIAIXVR6TANOGNBGANQ"
    },
    signature: {
      endpoint: "/vendor/fineuploader/php-s3-server/endpoint.php"
    }
  }
})
and use that uploader in a gallery
<Gallery uploader={ uploader } />
There are many useful options for customization: callbacks, event handlers, etc. You can find them all in the Fine Uploader docs.
Edit: if I'm not mistaken, react-transition-group is necessary even though it's not listed anywhere in the docs...
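To tie the pieces together, here is a minimal sketch of the wiring (it reuses the placeholder endpoints from above; the import paths follow the react-fine-uploader README):
import React from "react"
import FineUploaderS3 from "fine-uploader-wrappers/s3"
import Gallery from "react-fine-uploader"
// Stylesheet that ships with the library's Gallery component.
import "react-fine-uploader/gallery/gallery.css"

const uploader = new FineUploaderS3({
  options: {
    request: {
      endpoint: "http://fineuploadertest.s3.amazonaws.com",
      accessKey: "AKIAIXVR6TANOGNBGANQ"
    },
    signature: {
      endpoint: "/vendor/fineuploader/php-s3-server/endpoint.php"
    }
  }
})

const App = () => <Gallery uploader={uploader} />

export default App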

Google console fetch & render does not display AJAX fetched content

I have published a ReactJS website that relies on AJAX requests (POST requests to a GraphQL API, if that's relevant) to display data.
Using Google Search Console's fetch & render service, I can see that only the components that do not call any API are rendered. AJAX-based components are not rendered at all.
Fetch & render does show me two rendering pictures of my website (Google vs. visitor), but both are missing the AJAX content.
Is server-side rendering mandatory in this case?
I do not have a robots.txt file.
I'm doing something like:
import React, { Component } from 'react';
import { observer } from 'mobx-react';
import { observable, runInAction } from 'mobx';
import axios from 'axios';
import ContributorList from '~/components/global/ContributorList';
import type { User } from '~/types';
import CSSModules from 'react-css-modules';
import styles from './style.less';

@observer
@CSSModules(styles)
export default class Footer extends Component {
  @observable contributors: Array<User> = [];

  async loadContributors () {
    // POST a GraphQL query and unwrap the response payload.
    const { data: { data } } = await axios.post('/', {
      query: `
        {
          users {
            id,
            name,
            slug,
            avatar {
              secure_url
            }
          }
        }
      `
    });
    // Mutate observable state inside an action, as MobX expects.
    runInAction(() => {
      this.contributors = data.users;
    });
  }

  componentDidMount () {
    this.loadContributors();
  }

  render () {
    return (
      <div styleName='contributors'>
        { 'Static content' }
        <ContributorList
          contributors={ this.contributors }
        />
      </div>
    );
  }
}
In my browser, I can see everything fine ('Static content' plus the contributors that are fetched asynchronously). But with Google fetch and render, I see two screenshots with only 'Static content' displayed.
As a result, my dynamic content does not appear in google search results.
Nowadays everyone is talking about progressive web apps (PWAs), which rely on Ajax and render the content of the website using client-side techniques.
But these techniques are not SEO friendly at all, since Google can't index classic Ajax requests or most modern request patterns.
Since you are using ReactJS and you want Google to index your website, you have to use server-side rendering.
This can slow down your website, but at the same time Google can crawl and index all your pages, and users can view the site on any device, even old devices.
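As a minimal sketch of what server-side rendering looks like (assuming an Express server and a bundled <App /> component; all names here are illustrative, not your actual setup):
// server.js - render the React tree to HTML before it reaches the browser.
import express from "express"
import React from "react"
import { renderToString } from "react-dom/server"
import App from "./App"

const server = express()

server.get("*", (req, res) => {
  const html = renderToString(<App />)
  res.send(`<!doctype html>
<html>
  <body>
    <div id="root">${html}</div>
    <script src="/bundle.js"></script>
  </body>
</html>`)
})

server.listen(3000)
The crawler receives the full markup immediately, and the client bundle hydrates it on load. Note that lifecycle data fetching (like the componentDidMount above) does not run on the server, so SSR setups usually fetch data before rendering.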
When you want to build a PWA, or a modern web app in general, you have to go back to the basics, to an old technique called graceful degradation: if you disable JS in your browser, you can still see the content, and all the functions of your website still work.
This is the same way Google crawls and indexes your website.
There are several recommendations from Google regarding crawling Ajax content:
Use a clean URL structure and avoid JS links (javascript:void(0);)
Load all the content using server-side rendering
Make sure that your website works on all devices (responsive)
Submit an XML sitemap
Use canonical links if you have more than one URL structure (which is not recommended)
We are working on a new project similar to yours, built using ReactJS, and by following the points above it will be one of the first PWAs in the world that Google can crawl like any other website.
Google has also published an article on their Webmaster blog about PWAs and modern web apps like yours.
For more details check this link
https://webmasters.googleblog.com/2016/11/building-indexable-progressive-web-apps.html
Regarding the robots.txt file: it's recommended to add one for any website, even if you don't have pages to block, since you may want to block Ajax endpoints and other annoying crawlers.
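For example, a minimal robots.txt along those lines (the blocked path and sitemap URL are illustrative) could be:
User-agent: *
Disallow: /api/
Sitemap: https://example.com/sitemap.xml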

CRM: What is the difference between update Plugin and Update in XrmServiceToolkit

Is there any difference between updating an entity using a Plugin vs Updating an entity using XrmServiceToolkit?
var entityA= new XrmServiceToolkit.Soap.BusinessEntity("entA", id);
entityA.attributes["attrA"] = { value: attrValue1, type: "OptionSetValue" };
entityA.attributes["attrB"] = { value: attrValue2, type: "Money" };
XrmServiceToolkit.Soap.Update(entityA);
I know plugins can be used to connect to external databases, but for a very basic update, is there any difference?
Thank you!
Operations in plugins are seamlessly integrated with the business logic of your CRM platform. Plugins are invoked in every scenario, regardless of whether they are triggered by a web page (JavaScript calls, e.g. using XrmServiceToolkit), a workflow, external systems, integration tools, or even other plugins.
An update done on your web page with JavaScript only works on that form. If you only need it there, it's fine. If you need to cover other scenarios as well, you may have to look for another solution.
