I want to build a web application in which the database, backend and frontend are separated and communicate via REST. I got quite confused about how to structure the project(s). From what I have read, there are the following approaches:
Make separate projects: one for the frontend (let's say Angular) and one for the backend (Spring), including the database connection. These are completely separated from each other, and different IDEs might be used.
Build it in one big project but still use REST for communication (see picture below).
What I would like to know is: what is the difference between these two approaches? I am not asking which one is better, but I cannot even make out reasons or effects that would favour one over the other.
This is really a question about personal preference, but you can keep them in completely separate projects and at the same time put your JS bundles (from ng build) inside the resources/static folder of the Spring app, and it'll work perfectly, assuming you want to run them on the same server.
You can set up a proxy config to make this easier, like:
{
    "/api": {
        "target": "http://localhost:8080",
        "secure": false
    }
}
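You would typically save this as proxy.conf.json and start the dev server with ng serve --proxy-config proxy.conf.json so that matching requests are forwarded.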
This way, whenever you make a REST call with something like Angular's HttpClient, as long as you put /api in front of the URL it will call your Spring backend.
Example:
public fetchResource(id: number): Observable<Resource> {
    return this.http.get<Resource>(`/api/resources/${id}`);
}
I prefer to have my client and api in different projects.
Whenever you want to add the JS bundles to your resources/static folder, you can just create an NPM script to do it for you in package.json.
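For example, a copy script along these lines (the folder layout and output path are assumptions; npm runs the postbuild script automatically after build):

{
    "scripts": {
        "build": "ng build",
        "postbuild": "cp -r dist/* ../spring-app/src/main/resources/static/"
    }
}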
I've searched and searched and can't seem to find a pattern for this. I'd consider myself an intermediate Vue dev, but the backend is my strong suit. I'm working on an app that will be white-labeled by resellers. While it's possible to have multiple builds, avoiding that would be ideal. The setup is a stand-alone vue-cli SPA connecting to a Laravel API backend and using the Sanctum auth package, so I need calls to the same domain. The issue: resellers will be on their own domains. The ask: is there a pattern/solution for dynamically loading configs (mainly baseURL) for different domains (other items would be theme/stylesheet)? Currently I have a few typical entries:
i.e. axios.defaults.baseURL = process.env.VUE_APP_API_BASE_URL
Basically, based on the domain the site is being served on, I'd like a dynamic/runtime config. I feel like this has been solved, but I can't seem to use the right search terms for some direction, so anything is helpful. I've tried a few things:
1) Parsing in JS, but I can't seem to get it to run early enough in the process to take effect. It seems to work, but I can't get it to "click".
2) Hitting a public API endpoint with the current domain and getting the config back. Again, I can implement it, but I can't seem to get it injected into the Vue side correctly.
Any resources, pattern references or general guidance would be much appreciated, to avoid maintaining multiple builds merely for a few variables. That said, I don't think there's much overhead in any of this, but I'm also open to being told I'm wrong and that I need multiple builds.
End Result
url visited is https://mydomain.com
then baseURL = https://api.mydomain.com
url visited https://resellerdomain.com
then baseURL = https://api.resellerdomain.com
I don't think there is a common pattern to solve your problem - I haven't found anything on the net.
The best software design solution could be the following:
have a single back-end
distribute only the client to your customers/resellers
Obviously the back end could see the domain of the application from which the request comes and manage the logic accordingly.
Good luck with your project.
Honestly, the way the question is put is not really clear to me, but my usual pattern is to:
Create an axios instance like so:
export const axiosInstance = axios.create({
    // ...configs
    baseURL: process.env.VUE_APP_URL_YOU_WOULD_LIKE_TO_HIT
})
and then whenever I make a request to some API, I use this instance.
EDIT: According to your edit, you can either release the client to each customer and have a .env file for each and every one of them, or you can have a gateway system, where the client's axios endpoint is always the same, always hitting the same server, and from there the server decides what to ping based on your own logic.
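For the runtime-parsing approach the question mentions, a minimal sketch looks like this (the api. subdomain convention is an assumption taken from the question's desired end result, not a general rule):

import axios from "axios";

// Derive the API base URL at runtime from the current domain,
// e.g. https://resellerdomain.com -> https://api.resellerdomain.com,
// so a single build works for every reseller domain.
export const axiosInstance = axios.create({
    baseURL: `${window.location.protocol}//api.${window.location.hostname}`
});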
I am currently using the MarkLogic Spring Boot demo. So far I have been able to add indexes, facets, front-end logic, etc. just fine.
Right now, I am trying to layer in some semantic logic into a rest endpoint.
I wrote a simple query in the Query Console and attempted to add it to the src/main/ext folder so that it would get deployed by the ml-gradle bootRun.
Right now, this file gets deployed to the test-modules database and is visible once saved (I can see it in the explorer under the URI /ext/my-endpoint). I also tried adding a folder named rest-api, but that just deploys it to /ext/rest-api/my-endpoint.
At the top of the endpoint I have it declared as
module namespace ext = "http://marklogic.com/rest-api/resource/my-endpoint";
However, when I navigate to the URL it should be living at, http://localhost:8090/LATEST/resources/my-endpoint, it tells me it does not exist.
So I can see it in the modules database, but I can't reach it over HTTP. The query works in the Query Console (and is rather trivial: an and-query of json-property-word-queries).
My question is:
How can I properly update the marklogic-spring-boot framework so that it deploys REST endpoints correctly?
So it seems I figured it out.
Manually creating the file doesn't register it with the deployment workflow properly.
Instead I create the resource using ml-gradle
gradle mlCreateResource -PresourceName=my_endpoint
This will create a new folder called services and create the endpoint for me, which can then have its code overwritten.
I'm still not sure what Gradle is doing that's special, so I don't know what the proper way to do this manually would be, but at least it works.
I am developing a simple CRM app using Laravel 5.2 and ReactJS. Previously I was using them independently, but now I want to try to combine them together so Laravel will be my API and front-end will be all in ReactJS.
As far as I know, when my app is ready to go live I will serve my master view, which includes the root div, bundle.js, etc.
When it comes to the development part I am a little confused. I really love React hot reload, but for now I had to use a little workaround to make this work.
I run two servers side by side, webpack-dev-server and Homestead, so I am able to make calls to my API. But I also have to keep an additional index.html for webpack-dev-server. When I change something in my index.blade.php view, I also need to change it in this index.html, and this is a bit of a pain.
Is there any cool trick that I can apply to improve my workflow? If there is an example, please provide a link, because I wasn't able to find one. There are many small todo apps, but they don't really solve my problem.
PS. Currently I am using this approach https://github.com/sexyoung/laravel-react-webpack
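For reference, the part of such a setup that redirects API calls to the backend looks roughly like this in the webpack-dev-server config (a sketch; the Laravel port is an assumption and may differ from the linked repo):

module.exports = {
    // ...entry, output and loaders omitted
    devServer: {
        proxy: {
            "/api": {
                target: "http://localhost:8000", // wherever Homestead/artisan serves the Laravel app (assumption)
                changeOrigin: true
            }
        }
    }
};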
UPDATE
Well, I think I have found an acceptable solution. I will stick with the webpack server configuration that I provided in my question (it is really great because you get hot reload and your API calls are redirected to the backend port, so instead of localhost:8080/api/user... you call /api/user/1), but I have also developed a simple Laravel Elixir extension that compiles the PHP view to a simple static HTML page, which solves the problem of editing two index files (we all know programmers are lazy).
var php2html = require("gulp-php2html");
var gulp = require("gulp");
var rename = require("gulp-rename");
var elixir = require("laravel-elixir");
var Task = elixir.Task;

elixir.extend("php2html", function (message) {
    new Task("php2html", function () {
        return gulp.src("./resources/views/index.blade.php")
            .pipe(php2html())
            .pipe(rename('index.html'))
            .pipe(gulp.dest("./"));
    })
    .watch("resources/views/index.blade.php");
});

elixir(function (mix) {
    mix.sass('app.scss');
    mix.php2html();
});
So at the moment I have two index files:
index.blade.php in resources/views, which is resolved by the router in production
index.html in the root of my application folder, which is used by webpack-dev-server during development
and of course these files are now kept in sync by the gulp watch :)
If there is any better approach let me know guys.
I have usually solved this problem (duplicated index.html/php file) using this webpack plugin: https://github.com/ampedandwired/html-webpack-plugin
The idea is the opposite of yours, I think. Instead of compiling your PHP files into static HTML, you can use HtmlWebpackPlugin to output an index.tmpl.php file (or whatever extension you need) using the filename configuration option. Normally I set that path to the templates folder of my application server.
I believe this approach is generally easier than doing it the other way round.
Using this plugin has other benefits, such as automatic injection of bundle script tags depending on your webpack output config, and an automatic cache-busting file hash added to the script tag URLs.
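A minimal sketch of that setup (the template source and output paths are assumptions; point filename at the templates folder of your application server):

var HtmlWebpackPlugin = require("html-webpack-plugin");

module.exports = {
    // ...entry, output and loaders omitted
    plugins: [
        new HtmlWebpackPlugin({
            // source template to render (assumption: kept next to your other views)
            template: "./resources/views/index.tmpl.html",
            // emit the result into the server's templates folder, e.g. as a Blade view
            filename: "../resources/views/index.blade.php",
            // script tags for the generated bundles are injected automatically
            inject: "body"
        })
    ]
};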
I want to build an API-based application using Go and MongoDB. I come from an ASP.NET MVC background. If I were architecting an MVC web application, the things to consider would be:
Separation of concerns (SoC)
    DataModel
    BusinessEntities
    BusinessServices
    Controllers
Dependency Injection and Unit of Work
Unit Testing
    Moq or NUnit
Integration with UI frameworks
    AngularJS or others
RESTful URLs that enable SEO
The architecture below could be a solution for my needs in MVC-based applications.
There are resources around the web for building ASP.NET or Java based applications, but I have not found a solution for Go application architecture.
Yes, Go is different from C# or Java, but it still has structs and interfaces for creating reusable code and a generic application architecture.
With the above points in mind, how can we create a clean and reusable project structure in Go applications, and generic repositories for DB (MongoDB) transactions? Any web resources would also be a great starting point.
It depends on your own style and rules; in my company, we develop our projects this way:
Configuration is determined by environment variables, so we have a company/envs/project.sh file which has to be evaluated before running the service (it sits outside the project in the image).
We add a zscripts folder that contains all extra scripts, like adding users or publishing a post, intended to be used only for debugging purposes.
Data models (entities) are in a package called project/models.
All controllers and views (HTML templates) are categorized into "apps" or "modules". We use the REST path as main group delimiter, so path /dogs goes to package project/apps/dogs and /cats to project/apps/cats.
Managers are in their own separate packages at the project's root, e.g. project/manager.
Static files (.css, .png, .js, etc.) are located at project/static/[app/]. Sometimes the optional [app/] folder is required, but only when two apps have dashboards or conflicting file names; in most cases you won't need [app/] for static resources.
Managers
We call a manager a package that contains pure functions which help apps perform their tasks, for example databases, cache, S3 storage, etc. We initialize each manager by calling package.Startup() before we start listening, and finalize it by calling package.Finalize() when the program is interrupted.
An example of a manager could be project/cache/cache.go:
type Config struct {
    RedisURL string `envconfig:"redis_url"`
}

var config Config
var client *redis.Client

func Startup(c Config) error {
    config = c
    var err error
    client, err = redis.Dial(c.RedisURL) // assign the package-level client rather than shadowing it
    return err
}

func Set(k, v string) error {
    return client.Set(k, v)
}
in main.go (or your_thing_test.go):
var spec cache.Config
envconfig.Process("project", &spec)
cache.Startup(spec)
And in an app (or module):
func SetCacheHandler(_ http.ResponseWriter, _ *http.Request) {
    cache.Set("this", "rocks")
}
Modules
A module is a container for views and controllers that is isolated from other modules; with our configuration I would recommend not creating dependencies between modules. Modules are also called apps.
Each module configures its routes using a router, sub-router or what your framework provides, for example (file project/apps/dogs/configure.go):
func Configure(e *echo.Echo) {
    e.Get("/dogs", List)
}
Then, all handlers live in project/apps/dogs/handlers.go:
// List outputs a dog list of all stored specimens.
func List(c *echo.Context) error {
    // Note the use of models.Xyz
    res := make([]models.Dog, 0)            // A little trick to not return nil.
    err := store.FindAll("dogs", nil, &res) // Call the manager to find all dogs.
    if err != nil {
        // handle error ...
    }
    return c.JSON(200, res) // Output the dogs.
}
Finally you configure the app in main (or in a test):
e := echo.New()
dogs.Configure(e)
// more apps
e.Run(":8080")
Note: for views, you can add them to the project/apps/<name>/views folder and configure them using the same function.
Other
Sometimes we also add a project/constants and a project/utils package.
Here is what it looks like:
Note that in the above sample, templates are separated from apps; that's because it is a placeholder and the directory is empty.
Hope it was useful. Greetings from México :D.
I've also struggled with how to structure my Go web APIs in the past and don't know of any web resources that tell you exactly how to write a Go web API.
What I did was just check out other projects on GitHub and look at how they structured their code; for example, the Docker repo has very idiomatic Go code in its API.
Also, Beego is a RESTful framework that generates the project structure for you in an MVC way, and according to their docs it can also be used for APIs.
I've been building web APIs in Go for a little while now.
You'll have to do some research but I can give you some starting points:
Building Web Apps with Go -- ebook
github.com/julienschmidt/httprouter -- for routing addresses
github.com/unrolled/render/ -- for rendering various forms of responses (JSON, HTML, etc.)
github.com/dgrijalva/jwt-go -- JSON Web Tokens
www.gorillatoolkit.org/pkg/sessions -- Session Management
And for reference on how some things work together in the end:
Go Web API Repo -- personal project
1. Separation of concerns (SoC)
I haven't worked with SoC directly, but I have my own pattern. You can adapt to whatever pattern (MVC, your own, etc.).
In my code, I separate my code into different packages:
myprojectname (package main) — Holds the very basic setup and configuration/project consts
    * handlers (package handlers) — Holds the code that does the raw HTTP work
    * models (package models) — Holds the models
    * apis (NOT a package)
        - redis (package redis) — Holds the code that wraps a `sync.Pool`
        - twilio (package twilio) — Example of layer to deal with external API
Notes:
In each package other than main, I have a Setup() function (with relevant arguments) that is called by the main package.
The packages under the apis folder often exist just to initialize external Go libraries. You can also directly import existing libraries into the handlers/models without an apis package.
I set up my mux as an exported global in the handlers package like this...
var Router = mux.NewRouter()
...and then create a file for each URL (different methods with the same URL go in the same file). In each file, I use Go's init() function, which is run after global variables are initialized (so it's safe to use the router) but before main() is run (so it's safe for main to assume everything has been set up). The great thing about init() is that you can have as many of those functions as you want in a single package, so they automatically get run when the package is imported.
main then imports myprojectname/handlers and serves handlers.Router.
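A minimal sketch of one of those per-URL files (the route and handler names are illustrative, not from the answer):

// handlers/ping.go
package handlers

import "net/http"

// init runs after package-level variables (including Router) are initialized
// and before main(), so this route is registered before main serves Router.
func init() {
    Router.HandleFunc("/ping", getPing).Methods("GET")
}

func getPing(w http.ResponseWriter, _ *http.Request) {
    w.Write([]byte("pong"))
}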
2. Dependency Injection and Unit of Work
I haven't worked with Unit of Work, so I have no idea of possible Go implementations.
For DI, I build an interface that both the real object and the mock objects will implement.
In the package, I add this to the root:
var DatabaseController DatabaseControllerInterface = DefaultController
Then, in the first line of each test, I can change DatabaseController to whatever that test needs. Outside of testing, nothing touches it, so it defaults to DefaultController.
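A sketch of what that looks like (the interface, method and type names are illustrative):

// DatabaseControllerInterface is implemented by the real controller and by test mocks.
type DatabaseControllerInterface interface {
    FindUserName(id int) (string, error)
}

type realController struct{}

func (realController) FindUserName(id int) (string, error) {
    // real database lookup lives here
    return "", nil
}

// DefaultController is the real implementation; DatabaseController is what the
// rest of the package calls, and a test's first line can swap in a mock that
// implements the same interface.
var DefaultController = realController{}
var DatabaseController DatabaseControllerInterface = DefaultController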
3. Unit Testing
Go provides built-in testing via the go test command. You can use go test -cover to also emit a coverage percentage. You can even have coverage displayed in your browser, highlighting the parts that are/aren't covered.
I use the testify/assert package to help me with testing where the standard library falls short:
// something_test.go
//
// The _test signifies it should only be compiled into a test.
// Name the file whatever you want, but if it's testing code
// in a single file, I like to do filename_test.go.
package main

import (
    "testing"

    "github.com/stretchr/testify/assert"
)

func TestMath(t *testing.T) {
    assert.Equal(t, 3+1, 4)
}
4. Integration with UI frameworks
I haven't seen any for Angular. Although I haven't used it, Go has a good template engine built into the standard lib.
5. RESTful URLs that enables SEO
Again, I can't help you here. It's up to you: send proper status codes (don't send a 200 with a 404 page as you'll get docked for duplicate pages), don't duplicate pages (pay attention to google.com/something vs google.com/something/; hopefully your framework will not mess this up), don't try to trick the search engine, and so on.
To my mind, a Go webapp project folder on a production server can look like the one in your picture, just much simpler. Nothing special in the assets structure: Static, Templates, Content, Styles, Img, JSlibs, DBscripts, etc., the usual folders. Nothing special in the Web API either: as usual, you design which URIs respond with the required functionality and route requests to handlers accordingly. Some specifics: many gophers don't believe in MVC architecture, though that's up to you, of course. And you deploy one statically linked executable without dependencies. In your development environment you structure your own and imported/vendored source files in $GOPATH, as is done in the stdlib, but you deploy only one executable to the production environment, along with the static assets it needs. You can see how to organize Go source packages just by looking at the stdlib. Having just one executable, what would you need to structure in production?
I am currently working on an ASP.NET MVC 3 project and I am setting up the solution file in VS2010.
I am not sure what the standard approach is. I am using the following approach:
Company.Dept.Data (contains the dbml file - Data Model)
Company.Dept.Business (Business logic)
Company.Dept.Web (contains ASP.NET MVC3 webapplication)
The first two are class libraries and the last one is MVC3 web application.
Any other recommendations?
There is no single "standard" approach. It all depends on your project and what problems you are trying to solve with the software. Your proposed structure of having 2 class libraries and 1 web project is one way to go for sure.
If you are going to do any kind of Dependency Injection using an Inversion of Control container, you might also want to consider having an "API" project for interfaces and an "Impl(ementation)" project for concrete classes that fulfill the interface contracts.
To echo danludwig, there really is no standard. I prefer breaking up libraries and namespaces according to functionality: Company.Db is my library for interacting with the database, Company.Mail holds my wrappers around the Postmark mail service, etc.
I then tend to group like libraries into single repositories. So the 'storage' repository in source control holds Company.Db, Company.Caching, Company.FileStorage, etc. I have another repository 'messaging' that holds Company.Mail and Company.SMS (for interacting with Twilio to send text messages). When I branch out with new apps or new services (maybe a WCF endpoint for mobile clients), I can just pull down the 'messaging' repository, and I have all my class libraries for communicating with the user.
An application then looks like
Company.Application.Website
\Libraries\Messaging
\Libraries\Messaging\Company.Mail
\Libraries\Storage
\Libraries\Storage\Company.Db
\Libraries\Storage\Company.Caching
\Libraries\Web
...
Company.Application.Wcf
\Libraries\Messaging
\Libraries\Storage
\Libraries\Storage\Company.Db
\Libraries\Storage\Company.Caching
...
This way, whether someone registers via the site, or via the mobile app, Company.Mail.MailServices.SendWelcomeEmail() sends the exact same welcome email, and there's no code duplication.
Whether this works for you, or even makes sense, who knows. I've also changed this scheme a hundred times, trying to find a layout that works with my development style/workflow. I wouldn't worry or stress too much about it, because whatever you pick, you're going to find things you like about it, and you'll find things you hate about it. I sometimes fall into the trap of spending more time trying to make everything "perfect", than to just code and change things I don't like.