T4 to generate Enum from SQL Server table values - visual-studio-2010

What I want to achieve is more or less the inverse of this:
http://www.olegsych.com/2008/07/t4-template-for-generating-sql-view-from-csharp-enumeration/
I have a value-group table (enum names) and a values table (enum values), and want to turn those into enums. Both are in SQL Server, and both happen to be in an .edmx (so there would be quite a few ways to read the values).
Is there something "out there" that already does this (and I just didn't find it)? If not, what would be the best way to go about reading the data (SMO, the EDMX with dynamic loading, ...)?

I've put some more effort into writing such a template, so it now does all of the following:
generates enumeration values with explicit integer values;
uses Visual Studio's namespace naming convention, so generated enumerations get the project's default namespace with any subfolders appended (just like any code file in Visual Studio);
adds complete XML documentation to the enumeration using the values of an additional description column (if your table doesn't have one, never mind);
correctly names the generated file and adds an attribute in the code so the generated enum doesn't get scrutinised by code analysis;
multi-word lookup table values are correctly concatenated into Pascal-cased equivalents (e.g. "Multi word value" becomes "MultiWordValue");
enumeration values always start with a letter;
enumeration values consist of only letters and digits; everything else gets cut out.
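The naming rules in the list above (Pascal-casing, letters-and-digits only, must start with a letter) can be sketched in a few lines. This is a rough, language-agnostic illustration written in Java; the helper name and the leading-letter choice are mine, not the template's:

```java
// Turns a lookup-table value into a valid enum member name:
// Pascal-cases at word boundaries, drops everything that is not
// a letter or digit, and guards against a leading digit.
public final class EnumNames {

    public static String toEnumIdentifier(String value) {
        StringBuilder sb = new StringBuilder();
        boolean upperNext = true;
        for (char c : value.toCharArray()) {
            if (Character.isLetterOrDigit(c)) {
                sb.append(upperNext ? Character.toUpperCase(c) : c);
                upperNext = false;
            } else {
                // Any other character is a word boundary and gets cut out.
                upperNext = true;
            }
        }
        String name = sb.toString();
        // Enumeration values must start with a letter; the "N" prefix
        // here is an arbitrary choice for this sketch.
        if (!name.isEmpty() && Character.isDigit(name.charAt(0))) {
            name = "N" + name;
        }
        return name;
    }
}
```

So "Multi word value" becomes "MultiWordValue", and a value such as "2nd level (beta)" becomes "N2ndLevelBeta".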
Anyway, everything is documented in detail in this blog post.

Ok, here's how I implemented it:
Use VolatileAssembly from the T4 Toolbox to reference an assembly that...
Implements a T4 Helper class that does all the database work (when using EF, make sure to use a connection string when instantiating the context)
In the .tt, simply call into the T4 helper class to get the data you need and create your class(es)

Related

jOOQ: Is it possible to generate enums from certain tables given by a name pattern?

Sometimes it is necessary to have tables like:
CREATE TABLE contact_phone_type (
  -- PRIMARY KEY
  id BIGSERIAL PRIMARY KEY,
  -- ATTRIBUTES
  name VARCHAR(10) NOT NULL UNIQUE
);

INSERT INTO contact_phone_type (name)
VALUES ('Phone'),
       ('Fax');
Tables like this are sometimes annoying to map to enum types in order to have a convenient and typesafe mapping later on. Since those enum types have to be hand-written, it is a bit annoying to type the exact same thing a second time. Especially annoying is the case when the ordering changes: the corresponding enum then has to be re-ordered by hand as well.
Therefore I am wondering if jOOQs code generator might be able to generate those enums for me instead?
I am aware of this question, but my use case is by far not that tricky.
All the generator would have to do is look at whether, e.g., the table name ends with _type, and if so, create an enum with its elements taken from, e.g., the name column, and copy those generated files into a directory I point it to.
Is there a chance that this is possible?
Another thing that comes up with those enums is that one also has to write a corresponding Converter<>. If the code generator recognises a "type table", it could create both the enum and the corresponding converter.
Just a toy example:
private void createDeliveryPhoneNumber(Long shopId, String deliveryPhoneNumber) {
    this.ctx
        .insertInto(SHOP_CONTACT_PHONE)
        .set(SHOP_CONTACT_PHONE.SHOP_ID, shopId)
        .set(SHOP_CONTACT_PHONE.PHONE, deliveryPhoneNumber)
        .set(SHOP_CONTACT_PHONE.CONTACT_PHONE_TYPE_ID, ContactPhoneType.DELIVERY)
        .execute();
}
I am aware of this question, but my use case is by far not that tricky.
Apart from the fact that your use-case perception is subjective, and I disagree :), this is not available out of the box in jOOQ, for the same reasons as the use case in your linked question.
It is, however, rather easy to implement this kind of code generation yourself. Either you extend the jOOQ code generator to generate additional classes, or you do it in an entirely independent step.
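As one possible shape for the "entirely independent step": a small program could query each *_type table over JDBC and write an enum source file per table. The database read is stubbed out below so only the name-mangling and source-emitting parts are shown; the class and method names are illustrative, not jOOQ API:

```java
import java.util.List;

// Emits Java enum source text for a lookup table such as contact_phone_type.
// Fetching the names (e.g. "SELECT name FROM contact_phone_type ORDER BY id"
// over plain JDBC) is assumed to happen elsewhere.
public final class EnumSourceWriter {

    // "contact_phone_type" -> "ContactPhoneType"
    static String toClassName(String tableName) {
        StringBuilder sb = new StringBuilder();
        for (String part : tableName.split("_")) {
            if (part.isEmpty()) continue;
            sb.append(Character.toUpperCase(part.charAt(0)))
              .append(part.substring(1));
        }
        return sb.toString();
    }

    // Renders the enum body; element order follows the rows as fetched,
    // so re-ordering the table re-orders the generated enum automatically.
    static String emitEnum(String tableName, List<String> names) {
        StringBuilder sb = new StringBuilder();
        sb.append("public enum ").append(toClassName(tableName)).append(" {\n");
        for (int i = 0; i < names.size(); i++) {
            sb.append("    ").append(names.get(i).toUpperCase())
              .append(i < names.size() - 1 ? ",\n" : "\n");
        }
        return sb.append("}\n").toString();
    }
}
```

A Converter implementation could be emitted in the same pass, from the same table metadata, so the enum and its converter never drift apart.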

What is the best way to manage a large quantity of constants

I am currently working on a very complex program that processes rows from an input table and has a huge number of possible outcomes for each record. Because of this I have a very large number of constants defined for the outcome messages. There is one success message for the record, but a multitude of possible warnings and errors.
My first thought was to define all of the constants for these messages at the package body level, but then I decided to move each constant to the procedure where it is used. I'm now second-guessing that decision and thinking of moving everything back to the package body level. What is the best way to define this many constants? Ease of maintainability is my ultimate goal for this program, since it is so complex.
I think this is a matter of taste. In my application I put all error codes into an error package, and all main and commonly used constants into a separate package (without a package body).
Again, a matter of taste, but I tend to put a list of named constants at the package spec level rather than the package body so that they can be referenced by any portion of the application. If I ever want to change the error code that c_err_for_specific_reason_x uses, it becomes a single place to do so.
If I wanted to hide the codes and put them within the body I would have a get_error_code(p_get_error_name varchar) function that did the translation based on you passing a valid constant name.
I've done both on different projects, but tend towards the list over the function most times. I tend to use the function when it's a table-driven source of data.
It ... wait for it ... depends!
Since you currently define your constants in the package body, you don't need them to be publicly accessible outside the package. So defining them in a spec really doesn't buy you anything.
Here's is the rule I follow: Define constants within the smallest scope needed. So if a constant is used only within one procedure, define it in that procedure. If it is used within more than one procedure, define it in the body. If it is used elsewhere by code in other packages (or non-packaged SPs) but only when using a particular package, define it in the spec of that package. If it is used by other code for general use, put it in a separate spec of such general constants.

What is the difference between {block} and {include}?

What are the main differences between the {block} tag and the {include} tag? I know they are both used for template inheritance, but does one work faster or allow for more flexibility?
The {include} function simply refers to another template file whose contents should be included at that point in the output. It is not related to any kind of inheritance, and works like a cross between PHP's include/require and a function call, in that you can pass in parameters and variables can have local scope.
The {block} function is used for Template Inheritance. While the effects could be simulated by clever use of sub-templates, the fundamental idea is very different. As explained in the documentation, a parent template can have a number of named blocks, and a child template can override any or all of these, referencing them by name, with the remainder of the code coming directly from the parent template.
One way of thinking about it would be that {include} is useful if you have sections of content you want to include into multiple page structures, whereas Template Inheritance would be more appropriate if you want many pages with similar structure, but with different content in certain sections. And of course, you may well want a mixture of both.
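A minimal illustration of both (the file names are made up): the parent template defines a named block, the child inherits and overrides it, while {include} simply pulls a fragment in at that point:

```
{* layout.tpl (the parent template) *}
<html>
{include file="header.tpl" title="My Site"}   {* plain inclusion, with a parameter *}
{block name="content"}default content{/block}
</html>

{* page.tpl (a child template) *}
{extends file="layout.tpl"}
{block name="content"}content specific to this page{/block}
```

Rendering page.tpl produces layout.tpl's structure with only the content block swapped out.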

Is it possible to exclude entire namespaces from NDepend analysis?

I have a setup where Visual Studio 2010 runs test coverage analysis, and its output is absorbed by NDepend during an integration build.
A few assemblies contain generated code that needs to be ignored by NDepend.
Is there a way to do this? Preferably an entire namespace.
Code Query and Rule over LINQ (CQLinq) indeed provides a facility to ignore generated code.
There is the convenient predefined domain named JustMyCode of type ICodeBaseView.
The domain JustMyCode represents a facility of CQLinq to eliminate generated code elements from CQLinq query results. For example the following query will only match large methods that are not generated by a tool (like a UI designer):
from m in JustMyCode.Methods where m.NbLinesOfCode > 30 select m
The set of generated code elements is defined by CQLinq queries prefixed with the CQLinq keyword notmycode. For example, the query below matches methods defined in source files whose name ends with ".designer.cs":
notmycode from m in Methods where
m.SourceFileDeclAvailable &&
m.SourceDecls.First().SourceFile.FileName.ToLower().EndsWith(".designer.cs")
select m
The CQLinq query runner executes all notmycode queries before queries relying on JustMyCode, hence the domain JustMyCode is defined once, for all queries. Obviously the CQLinq compiler emits an error if a notmycode query relies on the JustMyCode domain.
There are 4 default notmycode queries, easily adaptable to match your needs. Note that there is no default notmycode query for namespaces, but you can create your own:
Discard generated Assemblies from JustMyCode
Discard generated Types from JustMyCode
Discard generated and designer Methods from JustMyCode
Discard generated Fields from JustMyCode
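Such a custom query for namespaces could look like the following, modelled on the default notmycode queries above (the namespace name is hypothetical, and I haven't verified this exact query against NDepend):

```
notmycode // Discard a generated namespace from JustMyCode
from n in Namespaces
where n.Name.StartsWith("MyCompany.Generated")
select n
```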
Found this in the "Quick summary of methods to refactor"
// Here are some ways to avoid taking account of generated methods.
!( NameIs "InitializeComponent()" OR
// NDepend.CQL.GeneratedAttribute is defined in
// the redistributable assembly $NDependInstallDir$\Lib\NDepend.CQL.dll
// You can define your own attribute to mark "Generated".
HasAttribute "OPTIONAL:NDepend.CQL.GeneratedAttribute")
But that doesn't address the need to modify every CQL query to ensure they all ignore the generated code.

Source code for specific stored procedure or function

I can use all_arguments and all_procedures to list the procedures and functions inside any given package, and with DBMS_METADATA I can extract the DDL for that package. Is there an easy way (other than lots of instring and substring calls) to obtain the procedure or function source code separately for each separate block of code in a package?
Something like this:
Owner | Package Name | Object Name | Overload | Arguments | Source
Obviously using substring and instring will present issues with overloaded functions.
All_arguments has the subprogram_id column which, according to the very sparse documentation on it, looks like it uniquely identifies the procedure it relates to within the package, but there doesn't appear to be anything that uses it.
Cheers in advance
IIRC, PL/SQL allows nested packages and functions. In that case you'll find that "instring" and "substring" may not be adequate to extract the source code, as you're facing recursion, and string functions typically handle only a smaller class of computations (typically regular expressions). This is a classic problem people run into when trying to parse languages with simple string manipulation. You can get around the limits of string functions by essentially hacking together a poor man's parser, but this can be a surprising amount of work if you want it to be dead right, because you have to handle at least the recursive grammar rules that matter for your extraction.
Another way to get reliable access to the elements of a PL/SQL package is to use a language parser. The DMS Software Reengineering Toolkit has a full PL/SQL parser.
You'd have to extract the package text to a file first, and then apply the PL/SQL parser to it; that produces an abstract syntax tree (AST) internally in the parser. Given the name of a function, it is rather easy to search the AST for a function with a matching name. You'd end up with more than one hit if you have overloaded functions; you could disambiguate by the hierarchy in which the function is embedded, or by whatever information about the arguments you have. Having identified a specific function in the AST, one can ask DMS to pretty-print that tree, and it will regenerate the text of that function, complete with comments.