Hi, I'm using DinkToPdf to generate PDFs in .NET. Just wondering if anyone knows of a way to change the header on some of the pages? E.g. page 1 has a certain header, pages 2-10 a different header, and pages 10-20 a different header again.
I was wondering if I have to generate three PDFs and then stitch them back together at the end?
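In case it's useful: DinkToPdf lets a single HtmlToPdfDocument contain several ObjectSettings, each with its own HeaderSettings, so stitching may not be necessary. A rough sketch (the HTML snippets and header strings below are placeholders):

using DinkToPdf;

// One document built from three objects; each object carries its own header.
var converter = new SynchronizedConverter(new PdfTools());

var doc = new HtmlToPdfDocument
{
    GlobalSettings = new GlobalSettings { PaperSize = PaperKind.A4 },
    Objects =
    {
        new ObjectSettings
        {
            HtmlContent = "<h1>Section 1</h1>",
            HeaderSettings = new HeaderSettings { Center = "Header for page 1" }
        },
        new ObjectSettings
        {
            HtmlContent = "<h1>Section 2</h1>",
            HeaderSettings = new HeaderSettings { Center = "Header for pages 2-10" }
        },
        new ObjectSettings
        {
            HtmlContent = "<h1>Section 3</h1>",
            HeaderSettings = new HeaderSettings { Center = "Header for pages 10-20" }
        }
    }
};

byte[] pdf = converter.Convert(doc);

The caveat is that each object starts on a new page, so this only works when each header change coincides with a content break you control; it can't switch headers at an arbitrary page inside a single object.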
Is there a certain way to check which pages on a website use a specific image?
Say I have an image which I don't use on a page anymore, so I'd like to delete it from my server. But I'm not entirely sure whether it's being used elsewhere. Is there a way to check if it's still being shown on other pages?
You can hook your website up to Google Webmaster Tools and wait a little; after a while, 404 errors will appear there. This way you can track unused resources and dead ends, including images.
There is a better way if you have direct access to the web server.
Visit every page on your website, or let Google crawl it.
You can then sort the files by last access time; the ones that haven't been accessed recently are not in use.
You have to make sure the images are actually fetched from the pages, so I would use a history-less, cache-less (private browsing) session.
How do you sort the files according to the timestamp in Unix?
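A quick sketch with standard Unix tools, assuming the filesystem actually records access times (i.e. it isn't mounted noatime) and a hypothetical image directory:

# Sort by last access time, newest first (-u makes -t sort by atime)
ls -ltu /var/www/images

# Or list files not accessed in the last 90 days (GNU find)
find /var/www/images -type f -atime +90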
I was reading about attacks on sites that allow uploading and downloading files. Some attacks involved uploading a JPG that is really an HTML file, and there was a comment about what happens if you want users to be allowed to store HTML files and download them (or perhaps view them in the browser without using the Save As feature).
Is there some type of flag I can use to say "do not execute"? I want users to be able to view images or video files that others have uploaded. What if I'd like user HTML to be displayed, but I don't want to force users to download it (Content-Disposition: attachment)?
Is there a way I can say: here is some user data; it could be an image, so img src should work; it could be HTML, so I'd like users to be able to see it, but without allowing it to read or write cookies or localStorage, make AJAX requests, etc.?
-edit- Come to think of it, all of my user data is hosted on its own cookieless subdomain for static files. That would get rid of many of the problems I mention, but what else is left to deal with? Also, I believe my MIME response depends entirely on what my web server does (nginx at the moment), which may simply be to look at the file extension.
-edit2- I adjusted my nginx config to serve the application/unknown Content-Type. It seems to do exactly what I want. I saw a suggestion to use application/octet-stream for unknown files, but that causes browsers (at least Firefox) to try to download the file even if it's a JPG that could be viewed in the browser.
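For reference, the nginx change described above would look roughly like this (the location path is made up; an empty types block disables extension-based type mapping so default_type applies to everything in the block):

location /userfiles/ {
    # Serve everything with one neutral type instead of guessing from the extension.
    types { }
    default_type application/unknown;
}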
It all depends on the Content-Type in your HTTP response: browsers decide how to handle the returned data based on it.
For example, say a user uploads an HTML file in an upload field supposedly meant for photos. As long as your web server reports the Content-Type as image/jpeg (or image/png et al.), the browser should handle it as an image; in this case an invalid image, because the file contains HTML instead of the usual binary data.
In any case, if you are feeling insecure about it, you can always peek into the file data during upload validation.
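To illustrate that last point, here is a hypothetical C# helper that peeks at the upload's magic bytes instead of trusting the extension (only the JPEG and PNG signatures are checked here; a real validator would cover more formats):

using System.IO;

static bool LooksLikeImage(Stream upload)
{
    // JPEG files begin with FF D8 FF; PNG files begin with 89 50 4E 47.
    // Assumes a seekable stream so we can rewind before the actual save.
    var header = new byte[4];
    int read = upload.Read(header, 0, header.Length);
    upload.Seek(0, SeekOrigin.Begin);

    bool isJpeg = read >= 3 && header[0] == 0xFF && header[1] == 0xD8 && header[2] == 0xFF;
    bool isPng  = read >= 4 && header[0] == 0x89 && header[1] == 0x50
                            && header[2] == 0x4E && header[3] == 0x47;
    return isJpeg || isPng;
}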
To try to minimize the number of requests in my application, I use base64 data URIs for small images in my CSS files.
But looking at the requests in the Google Chrome developer tools, I see that each image still generates a request.
Why is this? How do I make these images not generate requests?
This is my application: dev.bindsolution.com
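For reference, an image in CSS only avoids its own request when the bytes are inlined as a data: URI rather than referenced by URL; a shortened example:

.icon {
    /* The PNG bytes live inside the stylesheet itself, so no separate request
       is made for the image (the base64 payload below is truncated). */
    background-image: url("data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAA...");
}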
I have a website that displays product information that the client wishes to offer in PDF format. I need a way to dynamically convert a particular HTML page into a PDF and serve it to the end user on the fly; does anybody know of a way to do this? (There are WAY too many products to do this manually, and these products receive updates regularly, so a manual approach is out of the question.)
EDIT: I forgot to mention that I need this to work with either VB.NET or C#.
Have you tried iTextSharp?
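A minimal C# sketch of that idea, using iTextSharp 5 together with its XML Worker add-on (the markup needs to be reasonably well-formed XHTML, and HtmlToPdf here is just an illustrative helper name):

using System.IO;
using iTextSharp.text;
using iTextSharp.text.pdf;
using iTextSharp.tool.xml;

static byte[] HtmlToPdf(string html)
{
    using (var output = new MemoryStream())
    {
        var document = new Document();
        var writer = PdfWriter.GetInstance(document, output);
        document.Open();

        // Parse the (X)HTML and write the resulting elements into the PDF.
        using (var reader = new StringReader(html))
        {
            XMLWorkerHelper.GetInstance().ParseXHtml(writer, document, reader);
        }

        document.Close();
        return output.ToArray();
    }
}

The resulting byte array can then be written to the response with an application/pdf content type, which covers the "serve it on the fly" part.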
I haven't had a huge opportunity to research this, but I figure I'll just ask the question and see if we can create a knowledge base on the subject here.
1) Using subdomains will force a client-side cache. Is this the default behaviour, or is there an easy way for a client to disable it? I'm mostly curious about what percentage of users I should expect this to affect.
2) What exactly will be cached? Images? Stylesheets? Flash SWFs? JavaScript? Everything?
3) I remember reading that you must use a subdomain or www in your URL for this to work. Is this correct? (And does this mean SO won't allow it?)
I plan on integrating this into all of my websites eventually, but first I'm going to try it on a network of Flash game websites. The site itself will stay at www.example.com, but instead of using www.example.com/images, www.example.com/stylesheets, www.example.com/javascript, and www.example.com/swfs, I will create subdomains that point to them (img.example.com, css.example.com, js.example.com, and swf.example.com respectively). Is this the best course of action?
Using subdomains for content elements isn't so much about forcing caching as about tricking the browser into opening more connections than it otherwise would. This can speed up page load time.
Caching of those elements comes down entirely to the HTTP headers delivered with the content.
For static files like CSS and JS, a server will typically tell the client when the file was last modified, which allows the browser to re-request the file with an If-Modified-Since header carrying that timestamp. The specifics of improving on this with extra caching headers depend on which web server you use. For example, with Apache you can use the mod_expires module to set the Expires header, or the Header directive to output other types of cache-control headers.
As an example, if you had a subdirectory containing your CSS files and wanted to ensure they were cached for at least an hour, you could place a .htaccess file in that directory with these contents:
ExpiresActive On
ExpiresDefault "access plus 1 hours"
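And if you wanted an explicit Cache-Control header as well, the Header directive mentioned above could sit alongside it (this one needs mod_headers enabled):

Header set Cache-Control "max-age=3600, public"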
Check out YSlow's documentation. YSlow is a plugin for Firebug, the amazing Firefox web development extension. There is lots of good info on a number of ways to speed up your page loads, one of which is using one or more subdomains to encourage the browser to do more parallel object loads.
One thing I've done on two Django sites is to use a custom template tag to create pseudo-paths to images, CSS, etc. The path contains the last-modified time as a pseudo-directory. This path component is stripped out by an Apache mod_rewrite rule in a .htaccess file. The object is then given a ten-year time-to-live (ExpiresDefault "now plus 10 years"), so the browser will only load it once. If the object changes, the pseudo-path changes and the browser fetches the updated object.
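A rough sketch of that rewrite rule (the /media/ prefix and the timestamp format are hypothetical; the real template tag and layout would be site-specific):

RewriteEngine On
# /media/1289340123/css/site.css  ->  /media/css/site.css
RewriteRule ^media/\d+/(.*)$ media/$1 [L]

ExpiresActive On
ExpiresDefault "now plus 10 years"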