I see websites like Wufoo that give every new customer a way to register their own subdomain on the site (e.g. https://mystuff.wufoo.com). What are the benefits of doing this? I'm specifically curious about the technical pros and cons of this approach.
Why not just use https://wufoo.com/mystuff? Is it just aesthetics?
An obvious drawback that I should have spotted: if you use https://hostname.com/mystuff, then mystuff can no longer be used internally. That collides with your own page names, such as https://hostname.com/pricing: the name pricing is taken by the site itself, so no user will be able to register it. So it is inconvenient for both users and developers.
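To make the collision concrete: with a path-based scheme, every internal route has to be explicitly blocked from user registration. A minimal sketch in PHP (the route list and function name are made up):

```php
<?php
// Hypothetical reserved-name check for a path-based scheme
// (https://hostname.com/{username}): every internal route has to be
// blocked from registration, and the list grows with the site.
const RESERVED_NAMES = ['pricing', 'about', 'blog', 'login', 'api'];

function isUsernameAvailable(string $name): bool
{
    $name = strtolower($name);
    if (in_array($name, RESERVED_NAMES, true)) {
        return false; // collides with an internal page
    }
    // ...also check the users table, profanity lists, etc.
    return true;
}
```

With subdomains, the user namespace and the route namespace never overlap, so no such list is needed.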
I was wondering about best practice when using the MVC pattern.
When you develop an app for a client, you want to think business; you want to think as the customer would. That's why I'm wondering:
Isn't it better to develop the view part first, without any data processing, so the customer can validate it?
I see this practice as being as powerful as TDD: if you clearly know what your program will look like, you know what processing it will require, which makes the model part more concrete and business-oriented instead of too abstract and general.
I can't see any downsides to this, so if you can see some, or can explain why it's not a good idea, please do.
Thanks :-)
The main benefit, as I see it, would be the ability to provide the client with a hands-on prototype.
It is not uncommon for clients to change their mind, because often, when they hire a developer or a company, they have only a vague goal for the end product. This way you would mitigate the risk of large-scale changes late in the project's life cycle.
As for implementing such an approach, I would recommend looking into the concept of "presentation objects" (Fowler has this annoying habit of slapping "model" on every damned term).
With presentation objects you gain the ability to "shim" the data from the model layer's services, and they also let you figure out exactly which services (and service calls) your *UI layer* will interact with.
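As a rough illustration of that idea, a presentation object can return hard-coded data at first, so the client can validate the view before any model code exists, and be wired up to real services later. A PHP sketch, with all class and method names invented for the example:

```php
<?php
// Hypothetical service interface the real model layer will implement later.
interface AdvertisementService
{
    public function findActive(): array; // returns domain objects
}

// Presentation object: the view only ever talks to this, so the model
// behind it can be stubbed while the client validates the UI.
class AdvertisementListPresentation
{
    public function __construct(private ?AdvertisementService $service = null) {}

    public function getRows(): array
    {
        if ($this->service === null) {
            // Hard-coded rows while no real service exists yet.
            return [['title' => 'Sample ad', 'price' => '$10.00']];
        }
        // Once the model exists, reshape its data for the view.
        return array_map(
            fn ($ad) => [
                'title' => $ad->title,
                'price' => '$' . number_format($ad->price, 2),
            ],
            $this->service->findActive()
        );
    }
}
```

Because the view depends only on the presentation object's shape, swapping the stub for the real service later does not touch the view code at all.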
Note: of course, I am assuming that by "MVC" you do not mean some Rails-style abomination where "views" are just dumb templates.
Can anyone recommend a good option for translating websites into Spanish? We tried the Google Translate plugin, but the translation was so rough (very inaccurate, bordering on embarrassing the company) that we had to hire a company to refine it, which makes for an extremely inefficient process for updating the site going forward.
We're in health insurance, so the language we're translating is very specialized and needs to be accurate for our members. To make it even more complicated, the Google Translate plugin runs instantly, so the translation is live before we have a chance to refine it. In other words, there's no way to review the translation before the content is visible to users in production. Accuracy is a legal and regulatory requirement for Covered California and the Affordable Care Act, so it has to be a top-notch implementation.
Short of a proxy solution that intercepts the content before it hits the production site, or a separate site coded in Spanish, I'm not sure what other solutions exist, if any. Ideas? The separate-site solution is also problematic because it requires bilingual staff and doubles the work, since both environments have to mirror each other exactly at all times.
Any suggestions based on experience are most welcome!
Hire a developer; he will work out everything you need. You will not manage it on your own. If you already have one, hire a new one who knows how to do it. The question is very specific, but any PHP engine (framework), or even a custom one, can be extended to work the way you want.
Preview before publishing? Easy! Letting a moderator or admin change translation values? Easy! The main thing is that you will have to supply each sentence (or even paragraph) yourself. I won't describe the whole mechanism here: hire a developer and he will do everything you need. $)
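As a sketch of what such an engine could do: store each translated string with a review status, serve only approved translations in production, and fall back to the English source until a human has signed off. The table layout and function name below are assumptions:

```php
<?php
// Hypothetical lookup: serve only reviewer-approved Spanish strings;
// fall back to the English source text until review is complete.
function translate(PDO $db, string $key, string $englishText): string
{
    $stmt = $db->prepare(
        'SELECT text FROM translations
         WHERE string_key = ? AND lang = ? AND status = ?'
    );
    $stmt->execute([$key, 'es', 'approved']);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    // Draft machine translations stay invisible until a reviewer
    // flips their status to 'approved'.
    return $row ? $row['text'] : $englishText;
}
```

The same table can back a moderator screen where drafts are edited and approved before they ever reach production.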
Well, my question is: which should I use when?
For example, I use AJAX calls for all my form submissions: user creation, advertisement creation, advertisement editing, and so on.
Is there any golden rule about when to use event tracking and when to add a pageview manually? :)
Are there any disadvantages to either of them with regard to, e.g., goal tracking? :)
Thanks in advance!
The important thing is that you can't build funnels from event-based goals, so if you want to track at which point people drop out of a multi-step process, you need virtual pageviews.
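For example, since your form submissions happen over AJAX, you can fire either call from the success handler; only the pageview variant can later serve as a funnel step. A sketch in a PHP page template using analytics.js syntax (the virtual path, event names, and function name are made up):

```php
<?php /* Fragment of a page template: the ga() calls below are the
         standard analytics.js API; everything else is invented.
         In practice you would pick one of the two calls. */ ?>
<script>
function onAdCreated() {
    // Virtual pageview: counts as a page, so it can be a step in a
    // Goal funnel (note that it also inflates your pageview totals).
    ga('send', 'pageview', '/virtual/ads/created');

    // Event: keeps pageview counts clean, but cannot appear
    // in a funnel definition.
    ga('send', 'event', 'ads', 'create');
}
</script>
```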
Other than that, there are only "best practices", which are somewhat open to interpretation. I use virtual pageviews when I update significant parts of the content, but your idea of what is significant might differ from mine. So you need to get everyone who uses your tracking implementation on board: they need to be aware of which parts of the site are tracked in which way, especially if they try to compare your site with another, and you should have extensive, regularly updated documentation.
There is more than one way to do it, so it's important that your way is well documented and reproducible.
I'm testing mobile internet and noticed the provider is using a filter for sensitive content.
What approach are they using, exactly? Would it be a whitelist? I imagine it would be impractical to screen every site, while still running the risk of a child ending up on a site they shouldn't be on.
Or would they be using a third approach? Say, a clever filter that scans for words and weights the results.
Neither; do sanitization: https://www.owasp.org/index.php/Data_Validation#Sanitize
Both blacklists and whitelists leave you in a "circle" of constant updating and management, along with other issues.
There are companies that sell ready-made databases categorising sites by type. Then your provider would just decide which categories they want to let through and which to block - see e.g. http://technet.microsoft.com/en-us/library/ee207145.aspx "URL filtering is subscription based, and is part of the Forefront TMG Web Security Service license."
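Mechanically, it is little more than a lookup keyed by domain. A minimal sketch, assuming a vendor-supplied url_categories table (the table and category names are invented):

```php
<?php
// Hypothetical category-based filter: look the domain up in a
// vendor-supplied category database and block listed categories.
const BLOCKED_CATEGORIES = ['adult', 'gambling', 'violence'];

function isAllowed(PDO $categoryDb, string $host): bool
{
    $stmt = $categoryDb->prepare(
        'SELECT category FROM url_categories WHERE domain = ?'
    );
    $stmt->execute([strtolower($host)]);
    $categories = $stmt->fetchAll(PDO::FETCH_COLUMN);

    // Unknown domains pass through here; a stricter,
    // whitelist-style policy would block them instead.
    return !array_intersect($categories, BLOCKED_CATEGORIES);
}
```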
Of course the reputation of such schemes is pretty poor, with problems for towns such as Scunthorpe, for people selling wristwatches (bad words embedded in the name), and for sites about various cancers (on the assumption that anything about those body parts must be naughty).
Congratulations to Stack Overflow if this post gets through - although I have tried to make its job as easy as possible.
Hoping someone can chime in on an ideal methodology.
I don't want to run my site through a crawler every month to add new pages to my sitemap; I'd like a robust, systematic method, because maintaining it by hand seems very prone to, ahem, human forgetfulness. Is there some way to programmatically register new controllers, controller methods, views, etc. with some special controller? What I'm picturing is a mechanism that enforces updating the sitemap whenever you create a new controller method or view. I work in the LAMP stack, if that's relevant. This guy here is doing it through the file system, and that's not what I want for a public-facing sitemap.
Perhaps there's another best practice for this type of maintenance besides the concept I'm proposing. I'd love to hear how everyone else does this! :)
If your site is content-based, best practice is to read the database periodically and generate a link for each piece of content. With this method you can also give some subjects higher (or lower) priority in the sitemap.
That method was already mentioned in the topic you linked.
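A minimal sketch of that database-driven generation in PHP; the table layout, URL scheme, and file paths are all assumptions:

```php
<?php
// Hypothetical cron job: regenerate sitemap.xml from the content table
// so new pages are picked up without any manual bookkeeping.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$xml = new SimpleXMLElement(
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"/>'
);

$rows = $db->query(
    'SELECT slug, updated_at, priority FROM articles WHERE published = 1'
);
foreach ($rows as $row) {
    $url = $xml->addChild('url');
    $url->addChild('loc', 'https://example.com/articles/' . $row['slug']);
    $url->addChild('lastmod', date('Y-m-d', strtotime($row['updated_at'])));
    $url->addChild('priority', $row['priority']); // e.g. 0.5-1.0
}

$xml->asXML('/var/www/public/sitemap.xml');
```

Run from cron, this keeps sitemap.xml in sync with the database without anyone having to remember to update it.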
Otherwise, you can keep a (static) list of visited pages server-side, or just log them. After recording your site traffic, check the sitemap asynchronously (i.e. without blocking the user experience) and add your page links there. You can set priority with this method too, based on how intensively each page is visited and some statistical logic.