I imagine a very common scenario is one where an entire dynamic page could be cached in a way that bypasses the whole framework/CMS stack, except that some small piece of information changes depending on whether somebody is logged in or not. For example, the menu might change from "login" to "Welcome Somebody!". Now there's no way to cache the whole page, obviously.
One solution I was thinking of would be to load this information via AJAX after the page has already loaded (sketched below).
Does anybody have any advice here?
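To sketch what I mean: the uncached piece would be a tiny server-side endpoint that does nothing except start the session and return the login-dependent fragment, which the cached page then pulls in after load. A rough PHP sketch (the file name, session key and markup are just placeholders, not anything a particular stack prescribes):

<?php
// login-state.php - the only request that still hits the stack on a cached page
session_start();

header('Content-Type: application/json');
header('Cache-Control: no-store'); // never cache the per-user fragment

if (!empty($_SESSION['user_name'])) {
    echo json_encode(array('html' => 'Welcome ' . htmlspecialchars($_SESSION['user_name']) . '!'));
} else {
    echo json_encode(array('html' => '<a href="/login">login</a>'));
}

The cached page would fetch this with a small client-side request once it has loaded and swap the result into the menu.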
Write the page stream to the file system. Name the file with the entire URL including the query string. If the page contains session data, include a session id in the file name. Keep a list of cached pages with their names somewhere so that you can look up whether something is in the cache without having to go to the file system.
This is essentially what FatWire Content Server does.
Since this appears to be language-agnostic, you could create a temp file with the raw output of the page, and then when the same page is loaded again, dump the contents of the temp file directly into the HTTP response of the current page.
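A minimal sketch of that file-based approach in PHP (the paths and cache-key scheme here are illustrative assumptions, not a fixed recipe):

<?php
// Front controller, before the framework/CMS boots.
// Cache key: full URL including the query string; add the session id
// to the key if the page contains session data, as described above.
$key = md5($_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']);
$cacheFile = '/var/cache/pages/' . $key . '.html';

if (is_file($cacheFile)) {
    readfile($cacheFile);   // serve straight from disk, bypassing the stack
    exit;
}

ob_start();
// ... normal framework/CMS rendering happens here ...
$html = ob_get_contents();
ob_end_flush();            // still send the page to this first visitor

file_put_contents($cacheFile, $html, LOCK_EX); // cache it for the next request

Anything that isn't safe to cache (per-user pages, POST requests) would need to skip this path, and the lookup list mentioned above avoids hitting the file system just to check whether a page is cached.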
Hoping someone here can assist with giving me a few ideas on what to check and where to look with a new issue I have.
I have implemented W3TC caching on a website to improve loading times, but it has created a problem with page loading - the pages show garbled text on first load, and after a refresh they seem to work fine.
I am 99% sure it's W3TC because when I clear the cache, the issue happens and then after a refresh, it goes away.
This is what is displayed before refreshing:
����v�'����"fd��t�o�o���<{Jz�F'ؼ�Tʡ ���>����ꛯ�EM����7ֈ��e����l{��#�ƛ��oA�
(it continues like that for several more kilobytes of binary-looking output)
This kind of thing happens when there's an issue with the content encoding, which is typically down to compression settings.
Usually either the Content-Encoding header is invalid or missing, or the Vary header is. I don't know W3TC, but a quick search for "W3TC content encoding error" brought up a few results, so this is at least an issue that has bitten a few other people.
Apache default compression settings
Again, I don't know W3TC, but from implementing similar caching setups: the first time it sees a request for a page which hasn't been cached yet, it'll build an .html file, compress it with something like gzip, and save it as an .html.gz file. Whenever a subsequent request comes in, Apache can then serve that static file directly as-is (knowing it's already compressed because of the file extension).
The issue occurs because it also sends that gzipped data to the first requester. By default, Apache compresses responses itself (unless it knows not to), so the result is that the first response gets compressed twice.
So, the possible options:
Turn off Apache's default compression by disabling mod_deflate for your website (assuming all your requests go through W3TC anyway, this is probably the route W3TC would expect)
Edit W3TC or your website to add something like apache_setenv('no-gzip', '1');, which has the same effect as the above but gives you more control over which requests it applies to (see the sketch after this list)
Turn off W3TC's compression (I wouldn't do this though; consider it a last resort!)
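As a rough illustration of the second option, assuming PHP runs as an Apache module (otherwise apache_setenv() isn't available), something along these lines early in the request should work:

<?php
// Ask mod_deflate not to compress this response; W3TC's .gz copy is already compressed.
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');
}

// Also make sure PHP itself isn't adding a second layer of compression.
ini_set('zlib.output_compression', 'Off');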
To display a dynamically loaded image in my webapp I'm using a BufferedDynamicImageResource. (It just loads the image from a backend server based on a database id.)
The URL of the image resource ends up as:
http://localhost:8080/wicket/page?17-IResourceListener-logotype
where the 17 is a sequence number that increases for each such image I generate.
The problem is that the URL is reused from execution to execution (the sequence number is reset to 0), so when I restart the server the browser does not fetch the newly generated images, but instead uses the cached versions (which were generated during the last execution of the webapp).
My question: what is the best way to avoid this behavior? (If I could, for instance, include the database id of the loaded image in the URL, everything would work fine.)
The most common way to solve this would be to mount the resource at a fixed path (see Wicket's documentation on mounting resources). Using this approach, you could pass the database id as a parameter, or add an (ignored) random parameter to prevent caching completely.
I have created a Processing sketch (.pde file) that draws a time series (coffee production vs. time) and takes its data from an Excel export (.tsv table). Can anyone tell me how to include this in my webpage?
I have tried with processing.js but it does not show anything in the browser.
Without more information it's hard to say, but you probably have your .tsv file in a "data" directory and aren't explicitly loading it from "./data/myfile.tsv", instead relying on Processing to auto-resolve the path. If you intend to use your sketch online, always include "data/" in your file locations, because browsers resolve locations relative to "where the page is right now".
I am using dompdf to create PDF files. What I want is to generate the file once, so the user can see its contents, but protect it so that once the user closes it, they can't reopen it later. Is that possible, or should I use another program?
This really isn't possible. It sounds like what you want is for the document to be destroyed after first reading (Mission Impossible style). That's not how the web works. A file that can be accessed over the web can be easily downloaded and opened offline.
Certainly there are hacks around this, but they would be fairly involved to implement. I once created a Flash-based viewer that loaded another file that contained the actual document. Any tech-savvy user could still obtain the original document by examining the network traffic, but your average non-technical user wouldn't know how to do it.
You do have options for enabling restrictions in a PDF, but the user will always be able to save it and re-open it later. Probably what you want to do is implement restrictions on the document and load it in an iframe to prevent saving.
You can implement print/copy restrictions as follows:
$dompdf = new DOMPDF();
$dompdf->load_html($html);
$dompdf->render();
$dompdf->get_canvas()->get_cpdf()->setEncryption('', 'ownerpass', array());
$dompdf->stream();
The parameters of setEncryption are:
string, user password (restrictions apply)
string, owner password (unlocks document)
array, strings indicating allowed actions when user password is supplied (e.g. print, copy). If left blank the user is limited to saving the document.
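For example, to let anyone opening the document with the (empty) user password also print and copy, the last argument would carry those action names (same password setup as above):

$dompdf->get_canvas()->get_cpdf()->setEncryption('', 'ownerpass', array('print', 'copy'));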
A PDF is a document; it has no scripting instructions. Maybe you want to embed it in an .exe, have the .exe extract it, keep checking whether the file is still locked (i.e. still open), and delete it as soon as the lock is released.
I have a custom file type that is organized in sections, with a header at the start that gives the offset and length of each section within the file.
Currently, whenever I want to interact with the file, I must either load and parse the entire thing up front, or else pick only the sections that I need and load just them.
What I would like to do is to achieve a hybrid approach where each of the sections is loaded on-demand.
It seems however that doing this has a lot of potential downsides in terms of leaving filesystem handles open for longer than I would like and the additional code complexity that I would incur.
Are there any standard patterns for this sort of thing? It seems that my options are to:
Just load the entire file and stop grousing about the cycles/memory wasted
Load the entire file into memory as raw bytes and then satisfy any requests for unloaded sections from the memory buffer rather than disk. This saves me the cost of parsing the unneeded sections and requires less memory (since the disk representation is much more compact than the object model around it), but still means that I waste memory for sections that I never end up loading.
Load whatever sections I need right away and close the file but hold onto the source location of the file. Then if another section is requested, re-open the file and load the data. In this case I could get strange results if the underlying file is changed.
Same as the above, but leave a file handle open (perhaps allowing read sharing) - roughly sketched after this list.
Load the file using Memory-Mapped IO and leave a view on the file open.
Any thoughts?
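For reference, option 4 in my list would look roughly like this (the class name and header handling are invented just to illustrate the shape of it; the real format obviously differs):

<?php
class SectionedFile
{
    private $handle;
    private $index = array();    // name => array(offset, length), parsed from the header
    private $sections = array(); // cache of sections already loaded

    public function __construct($path)
    {
        $this->handle = fopen($path, 'rb');  // keep the handle open for later reads
        $this->index  = $this->readHeader(); // the header is small, so parse it eagerly
    }

    public function section($name)
    {
        if (!isset($this->sections[$name])) {
            list($offset, $length) = $this->index[$name];
            fseek($this->handle, $offset);
            $this->sections[$name] = fread($this->handle, $length); // load on demand
        }
        return $this->sections[$name];
    }

    private function readHeader()
    {
        // Parse the real header format here; this sketch just returns an empty index.
        return array();
    }
}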
If possible, mmap-ing the whole file is usually the easiest thing to do when you have a random-access pattern. This way you just delegate the loading/unloading issue to the OS, and you get options 1 & 2 for free.
If you have very specific access patterns, you can even use something like fadvise() (I don't know the exact Win32 equivalent) to tell the OS your access intent.
If your file is more than 2 GB, you can either go the 64-bit way or mmap() parts of the file on demand.
If the file is relatively small, mmap-ing the entire file is good enough. If the file is large, you could leave a mmap view open, and just move it around the file and resize it to view each section when needed.