When trying to integrate one website with another, what is the way to go? Iframe or pulling content? - ajax

My company has multiple vendors that all have their own websites. I am creating a website that acts as a dashboard where customers can access all of the vendors' sites. I wanted to know what the best option is for doing this.
Here's what I have so far:
Iframe
Can bring in the entire website
Seems secure enough (not sure if I'm missing any information on security issues for this)
Users can interact with the vendor's website through our site
Our website cannot fully interact with the vendor's website (Also may be missing info here)
Pulling in the content
Can bring in the entire website
Not very secure from what I hear (some websites actually say that pulling another website in is a violation of security and will alert the user of this, or something similar)
Users can interact with their website through our site
Our website can fully interact with the vendor's website
Anyone have any other options...?
What are some of the downsides to bringing in a site with an iframe and is this really our only option for doing something like this?
Optimally, we would like to pull their site into ours without using an iframe. What options do we have on this level? Is there anything better than an iframe?
Please add in as much information as you can about iframes, pulling content, security, and website interactions like this. Anything to add in is appreciated.
Thanks,
Matt

As far as "pulling content" is concerned, I wouldn't advise it, as it is fragile: all it takes is a simple HTML change on their end and your bot will break. Also, it's more work than you think to do this for one site, let alone the many that you speak of. However, there are 3rd-party apps that can do this for you if you have the budget.
You could use an iframe/frames; however, many sites might try to bust out of them, and that can ruin the user experience of the site within the frame.
My advice is to use a plain HTML link for each vendor in your dashboard that opens the vendor's site in a new tab, along the lines of the "Vendor Site Link" example sketched below.
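A minimal sketch of such a link (the vendor URL is a placeholder; rel="noopener noreferrer" keeps the new tab from getting a reference back to your dashboard window):

    <!-- one entry per vendor in the dashboard -->
    <a href="https://vendor.example.com/" target="_blank" rel="noopener noreferrer">Vendor Site Link</a>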

If you can have the sites that you are embedding add some client-side script, then you could use easyXSS. It allows for easy transfer of data, and also for calling JavaScript methods across the domain boundary.
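easyXSS is essentially a wrapper around cross-document messaging, so as a rough illustration of the underlying idea, here is a sketch using the browser's standard postMessage API (this is not easyXSS's own interface, and the origins and data shown are placeholders):

    <!-- On your dashboard page (https://dashboard.example.com) -->
    <iframe id="vendorFrame" src="https://vendor.example.com/widget"></iframe>
    <script>
      // Only trust messages coming from the vendor origin
      window.addEventListener("message", function (event) {
        if (event.origin !== "https://vendor.example.com") return;
        console.log("Vendor says:", event.data);
      });

      // Send data into the vendor frame once it has loaded
      var frame = document.getElementById("vendorFrame");
      frame.addEventListener("load", function () {
        frame.contentWindow.postMessage({ customerId: 1234 }, "https://vendor.example.com");
      });
    </script>

    <!-- Script the vendor would add on their page -->
    <script>
      window.addEventListener("message", function (event) {
        if (event.origin !== "https://dashboard.example.com") return;
        // ...use event.data, then reply to the dashboard
        event.source.postMessage("received", event.origin);
      });
    </script>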

I would recommend iFrames. Whilst not the most glamorous of elements, many payment service providers use iFrames for the Verified by Visa/Mastercard Secure Code integration.
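For the basic embed itself, a minimal sketch (the vendor URL and dimensions are placeholders):

    <iframe src="https://vendor.example.com/"
            title="Vendor Site"
            width="100%" height="800"
            style="border: 0;">
      Fallback text for browsers that cannot display frames.
    </iframe>

Keep in mind that a vendor can still refuse to be framed, either with frame-busting scripts as mentioned above or with an X-Frame-Options response header.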


How to apply PageSpeed Insights results

Well, I am trying to set up a web page with WordPress and GoDaddy hosting. I want to make a fast web page, because people say fast web pages appear on the first page of Google (especially mobile page speed is said to be very important). So I want to make a very fast web page, but my level of knowledge is not very advanced; I progress by learning.
If I test my web page with Insights, my mobile score is about 60-70. If I read the Insights report, there are lots of improvement suggestions listed below the score. I want to learn how to fix them. If you help me with one example, I will do the others myself.
If we start with the first problem, which is /css?family=… (fonts.googleapis.com), shown under the "Eliminate resources that prevent rendering" topic: how do I fix it? What should I do?
Also, in the "Coverage" tab some source code is shown that is not being used. For example, I am not using the easy-share plugin (second row in the image) on the homepage.
How do I safely remove that code from the home page? If I can learn how one fix is done, I can correct the others myself.
The issue you are running into is something I have seen over and over again. GoDaddy and WordPress sites generally are bloated and perform poorly.
Here are some tips to improve your speed and get a better PageSpeed ranking.
Hosting: Do you need to be on GoDaddy? I have seen this time and time again; most websites on GoDaddy are SLOW. GoDaddy is good for domain registration, not for hosting, and most non-tech folks do not know any better. Try Amazon Lightsail, AWS S3, Google Firebase, or Netlify. They all offer much faster page loads by reducing initial server response time, and they are surprisingly simple to learn and deploy.
CDN: You must use a content delivery network (CDN). Check out CloudFront; it offers a free tier that works quite well.
WordPress: This is your real issue. WordPress is neither easy to build with nor easy to maintain, and you need multiple plugins to make the site perform. Best would be to build your own. If you have to be on WordPress, check out image optimizers, minifiers, and cache plugins; Gumlet, WP Rocket, and ShortPixel are quite popular for improving speed.
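On the specific render-blocking /css?family=… request from fonts.googleapis.com that the question asks about, the usual low-effort fix is to preconnect to the font hosts and let text render with a fallback font while the webfont loads; a sketch (the font family shown is a placeholder, keep your own):

    <!-- In the <head>, before the font stylesheet -->
    <link rel="preconnect" href="https://fonts.googleapis.com">
    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

    <!-- display=swap lets text render immediately with a fallback font -->
    <link href="https://fonts.googleapis.com/css?family=Open+Sans&display=swap" rel="stylesheet">

Self-hosting the font files is another option and removes the third-party request entirely.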

Implementing reCAPTCHA on GitHub Pages without PHP?

Absolute newb here, please forgive me for this basic question.
I have built my portfolio site using GitHub Pages, but am experiencing spam via my contact form (hosted by GetSimpleForm). I am trying to implement Google reCAPTCHA, but I'm a bit stuck on the backend part. As I understand it, GitHub Pages doesn't support PHP, so I cannot actually complete the form verification.
Google's documentation here was unfortunately a bit overwhelming and cryptic to me as a beginner, since I just stared at my GitHub HTML/CSS/JS files and had no clue what to put where.
Am I trying to do the impossible? Is it possible to use reCAPTCHA on GitHub Pages? If so, is there a beginner-friendly tutorial somewhere or a straightforward "copy-paste" thing I could use? (So far, it's not been clear where to use the secret key from the API key pair, for example.)
Thanks a bunch for any leads or alternative solutions for spam prevention that would work in Github pages!
The short answer is that you cannot. GitHub Pages only supports static sites; you have to host your own website if you want to do more complex stuff like backend checks, and mostly that isn't free.
The only suggestion I can come up with is to simply change your contact form to a regular HTML form instead of one hosted by the third-party service you are using. I suspect that the main reason you get spam is because you are using its service.
A really simple way to do it is to make the form with HTML (you can either copy the code from a pre-made HTML site with a form, or find a YouTube tutorial that shows you how to make an HTML form, pretty simple) and host it on something like Netlify. Netlify is free for static websites unless you are doing something really complicated, and it has built-in form handling that will send you an email automatically every time someone fills out the form. You don't need PHP or a third-party app or anything.
You still create and edit the code of the website through GitHub, you just need to connect it to Netlify for the forms. I'm a complete beginner and I figured it out. Netlify has some tutorials that explain it nice and simple. There's no reason to pay or do a lot of complicated stuff, and you can make professional websites with just HTML and CSS.
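As a rough illustration of the Netlify form handling mentioned above (a sketch; the form and field names are placeholders, and Netlify detects the form via the data-netlify attribute once the site is deployed there):

    <form name="contact" method="POST" data-netlify="true">
      <label>Name <input type="text" name="name" required></label>
      <label>Email <input type="email" name="email" required></label>
      <label>Message <textarea name="message" required></textarea></label>
      <button type="submit">Send</button>
    </form>

Netlify Forms also offers built-in spam filtering (a honeypot field and, optionally, reCAPTCHA on their side), which addresses the original spam problem without writing any server code.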

Website creation query

I need to create a website which stores the list of all games a player has played and shows it right on their profile. As the player completes a game, he adds it to his list.
So I would need a basic login setup, and then, using AJAX, I will populate the list of games which he wants to add to his list, so that he can track the games he has played.
So now I need suggestions on how to go about it.
How do I start building?
Which language do I need to pick up?
I am well versed in Java and J2EE.
Is this enough?
Also, I am a freelancer, so I can't afford to pay for a website. Is there any free website hosting service which will help me build the website I have in mind?
Also, if I use a free website hosting service, will they provide me with a database and AJAX capabilities?
Here's the basic setup:
You need a domain first. Try to pick something unique, as it will be cheaper. You can find one on Namecheap: https://www.namecheap.com
You need hosting. Again, you can go with Namecheap.
To start building, you need to learn some HTML and CSS. HTML is the markup language of the web, and CSS is its stylesheet language. They aren't hard languages to start off in. You can start for free at Khan Academy: https://www.khanacademy.org/computing/computer-programming/html-css
I believe Namecheap offers database support as well. AJAX isn't something a hosting service provides; it's a client-side technique, JavaScript in the browser making asynchronous requests to your server, so any host that can serve your pages and a data endpoint will do.
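Since the plan is to populate the game list with AJAX, here is a minimal sketch of the browser side (the /api/games endpoint and the JSON shape are made-up placeholders; the server side would be your Java/J2EE app returning the logged-in player's games):

    <ul id="game-list"></ul>
    <script>
      // Ask the server for the player's games and render them as list items
      fetch("/api/games")
        .then(function (response) { return response.json(); })
        .then(function (games) {
          var list = document.getElementById("game-list");
          games.forEach(function (game) {
            var item = document.createElement("li");
            item.textContent = game.title;
            list.appendChild(item);
          });
        })
        .catch(function (err) { console.error("Could not load games", err); });
    </script>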
This should get you going. I can't really give you more detailed information than this because your question is really broad. If you Google your questions, you'll get good answers and guides.
Best of luck.

Content Watermarking

We have members-only paid content that is frequently copied and republished without our permission.
We are trying to 'watermark' our content by including each customer's user ID in a fake CSS class, for example <p class='userid_1234'> (except not so obvious, of course :). The class would help us track the source of the copying, and we place it somewhere in the article body.
The problem is, by including user-specific information into an article, it makes it so that the article content is ineligible for caching because it is now unique to each user.
This bumps the page load time from ~.8ms to ~2.5sec for each article page view.
Does anyone know of any watermarking strategies that can still be used with caching?
Alternatively, what can be done to speed up database access? ( ha, ha, that there’s just a tiny topic i’m sure.. )
We're using the CMS Expression Engine, but I'd like to hear about any strategies. They don't have to be EE-specific.
If you're talking about images then you could use PHP to add a watermark to the images.
How can I add an image onto an image in PHP like a watermark
It's a tool to help track down the lazy copiers who just copy the source code as-is. This is not preventative, nor is it a deterrent. – Ian 12 hours ago
Going by your comment above, you are happy with users copying your content, just not without the formatting etc. So what you could do is provide users with an embed-type snippet for that particular content, just like YouTube does with videos. Into that embed code you could add your own links back to your site, use your own CSS, and so on.
That way you can still allow members to use the content, but it will always come out the way you intended, with links back to your site.
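As a rough illustration of such an embed snippet (everything here, the script URL, the data attribute, and the endpoint it calls, is a hypothetical placeholder rather than an existing API):

    <!-- What a member would paste into their own site -->
    <div class="yoursite-embed" data-article-id="1234"></div>
    <script src="https://www.example.com/embed.js" async></script>

    <!-- embed.js, served from your site, would then do roughly this -->
    <script>
      document.querySelectorAll(".yoursite-embed").forEach(function (el) {
        // Your server would need to send CORS headers for this cross-site request
        fetch("https://www.example.com/api/embed/" + el.dataset.articleId)
          .then(function (response) { return response.text(); })
          .then(function (html) {
            // Inject the article with your styling and a link back to the original
            el.innerHTML = html +
              '<p><a href="https://www.example.com/articles/' + el.dataset.articleId +
              '">Read the original</a></p>';
          });
      });
    </script>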
Thanks
You could always cache a version that uses a special string, like #!username!#, and then later fill it in with PHP based on which user is viewing it.
Another way, I believe, is to switch from caching on the server to letting the browser cache it locally for a little while. That way it is only cached per user, and it reduces the calls to your database. Because an article is pretty static, you could just let the local computer cache it and pull in comments via JavaScript.
This last one is probably not one you are really looking for, but I'm gonna come out and say it anyway. You could not treat your users like thieves, and instead treat the thieves as thieves. Go to the person hosting the servers your content is on and send them an email telling them copyrighted premium content is being hosted on their servers without your permission. You can even automate that process.
How do you find out what sites are posting your content? Put a link in the body content to your site, and do a Google Search/Blog Search for articles linking to that site. To automate it, use Google Blog Search, because it offers RSS feeds. Anything that has a link back to your site could go into a database with a link to the page; someone could look at it, and if it is the entire article, do a Whois lookup and send them an email.
What makes you think adding CSS to something is going to stop people from copying it without that CSS? It's more likely that they are just copying the source of the content you are showing them and ignoring all the styling around it. For example, I use Tamper Data to look at all HTTP requests made by Firefox; if I can see it on the page, I can see it in the logs. Even with all the "protection" some sites try to put in place, it generally never works. I can grab what I want without using any screen capture/recording.
If you were serving FLVs, for example, I would easily be able to grab the source of those even if you overlaid them with some CSS. I think the best approach would be to contact the sites publishing your premium content and ask them to remove it. It's either that or watermark the actual content on the fly while sending it to the browser.

Mixing Secure and Non-Secure Content on Web Pages - Is it a good idea?

I'm trying to come up with ways to speed up my secure web site. Because there are a lot of CSS images that need to be loaded, it can slow down the site since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to.
One thing I was considering is perhaps moving style-based images and javascript libraries to a non-secure sub-domain so that the browser could cache these resources that don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people thought about doing something like this. Is this a feasible idea or should I go about optimizing my site in other ways like using CSS sprite-maps, etc. to reduce requests and bandwidth?
Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.
Do not mix content; there is nothing more annoying than having to go and click the Yes button on that dialog. I wish IE would let me always choose to show mixed-content sites. As Chris said, don't do it.
If you want to optimize your site, there are plenty of ways; if SSL is the only bottleneck left, buy a hardware accelerator. Hmm, if you load an image using HTTP, will it be cached when you later load it over HTTPS? Just a side question that I need to go find out.
Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all the content of your pages properly. Not that I endorse IE 7, but recently I had to look into this issue, and it's a pain to deal with.
This is not advisable at all. The reason browsers give you such trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a 3rd party could do to sniff sensitive info if the only insecure content is images, but CSS can contain references to JavaScript/VBScript via behavior files (IE). If your JavaScript is served insecurely, there isn't much that can be done to prevent a rogue script scraping your webpage at an inopportune time.
At best, you might be able to get away with iframing secure content to keep the look and feel. As a consumer I really don't like it, but as a web developer I've had to do that before due to no other pragmatic options. But, frankly, there are just as many if not more defects with that too; after all, you're hoping that nothing violates the integrity of the insecure page so that it hosts the real secure content and not some alternate content.
It's just not a great idea from a security perspective.
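On the sprite-map alternative the question mentions, the idea is to combine many small decorative images into one file and show only a slice per element, which cuts the number of (fully HTTPS) requests without mixing content. A minimal sketch (file name and offsets are placeholders):

    <style>
      /* icons.png contains all of the small images stacked vertically */
      .icon {
        display: inline-block;
        width: 16px;
        height: 16px;
        background-image: url("https://www.example.com/images/icons.png");
        background-repeat: no-repeat;
      }
      .icon-search { background-position: 0 0; }     /* first 16x16 slice  */
      .icon-cart   { background-position: 0 -16px; } /* second 16x16 slice */
      .icon-user   { background-position: 0 -32px; } /* third 16x16 slice  */
    </style>

    <span class="icon icon-search"></span>
    <span class="icon icon-cart"></span>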
