
T4: Generating interfaces automatically based on provided classes

With newer techniques and patterns, interfaces play a key role in application architecture. An interface makes an application extendable: for example, we can define a file upload interface and implement it against the file system, Azure Blob storage or Amazon S3. At the start we might implement it against Azure Blob, but later we might move to a Windows file system, and so on.
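As a minimal sketch only (the interface and class names here are illustrative and not part of the article's project), such a file upload abstraction could look like this:

 using System;
 using System.IO;
 using System.Threading.Tasks;

 // Illustrative file upload abstraction with swappable implementations.
 public interface IFileUploadService
 {
   Task UploadAsync(string fileName, Stream content);
 }

 // Initial/default implementation, e.g. targeting Azure Blob storage.
 public class AzureBlobFileUploadService : IFileUploadService
 {
   public Task UploadAsync(string fileName, Stream content)
   {
     // Azure Blob specific upload code would go here.
     throw new NotImplementedException();
   }
 }

 // Later the application can switch to a file system based implementation
 // without touching any code that depends on IFileUploadService.
 public class FileSystemFileUploadService : IFileUploadService
 {
   public async Task UploadAsync(string fileName, Stream content)
   {
     using (var target = File.Create(Path.Combine(@"C:\Uploads", fileName)))
     {
       await content.CopyToAsync(target);
     }
   }
 }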

Ideally we create an interface based on need and then start writing the actual default implementation class. Very often, at the start of an implementation, there is a one-to-one mapping between the interface and the class, like the file upload interface above and the initial or default class implementation we design; over time it gets extended.

In this article, we will create interfaces based on default class implementations. This is not at all recommended in Test Driven Development (TDD), where we write tests before the actual code, but I feel that in some situations it is okay to do this and test straight after creation. A couple of months back, I wrote an article on Dependency Injection, the Repository pattern and other related patterns (http://vikutech.blogspot.com/2015/01/architecture-solution-composting-repository-pattern-unit-of-work-dependency-injection-factory-pattern.html) where we created repository interfaces and then the actual classes. If we have a lot of database models, or frequent additions of functions, then every addition to an interface forces us to copy and paste function signatures between the interface and the classes.

In this way we can develop the application a little faster when it demands frequent additions of functions. This example is based upon the repository classes from the above link. The idea is to generate the interfaces automatically whenever the repository classes are modified. The same kind of refactoring, extracting an interface from a class, can be found in tools like ReSharper and Telerik JustCode, but here it is done through T4.
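To make the input concrete, here is a hypothetical domain model and hand-written repository class of the kind the template below reads (the Customer name is just an example; the namespaces match the ones configured in the template):

 namespace MyProject.Model.DomainModel
 {
   // Hypothetical domain model living in the model project.
   public class Customer
   {
     public int Id { get; set; }
     public string Name { get; set; }
   }
 }

 namespace MyProject.DB.Repository
 {
   using MyProject.Model.DomainModel;

   // Hand-written repository; the T4 below extracts ICustomerRepository from it.
   public partial class CustomerRepository
   {
     /// <summary>
     /// Gets a customer by its primary key.
     /// </summary>
     public Customer GetById(int customerId)
     {
       // Actual data access code lives here.
       throw new System.NotImplementedException();
     }
   }
 }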

Resources used 
I have taken the help of two include files, one to read project files and one to create multiple output files.
MultiOutput.ttinclude (https://github.com/subsonic/SubSonic-3.0-Templates/blob/master/ActiveRecord/MultiOutput.ttinclude): This is used to generate multiple output files from a single T4 file.
VisualStudioAutomationHelper.ttinclude (https://github.com/PombeirP/T4Factories/blob/master/T4Factories.Testbed/CodeTemplates/VisualStudioAutomationHelper.ttinclude): This helps us read files from a different project.
NOTE: Please change the paths of these files based on their location in your project.
 <#@ template debug="true" hostSpecific="true" #>  
 <#@ output extension=".cs" #>  
 <#@ Assembly name="EnvDTE" #>  
 <#@ Assembly name="EnvDTE80" #>  
 <#@ Assembly name="System.ComponentModel.DataAnnotations" #>  
 <#@ import namespace="EnvDTE" #>  
 <#@ import namespace="EnvDTE80" #>  
 <#@ import namespace="System" #>  
 <#@ import namespace="System.Linq" #>  
 <#@ import namespace="System.Xml" #>  
 <#@ import namespace="System.Collections" #>  
 <#@ import namespace="System.Collections.Generic" #>  
 <#@ import namespace="System.ComponentModel.DataAnnotations" #>  
 <#@ import namespace="System.Text.RegularExpressions" #>  
 <#@ include file="T4Plugin/VisualStudioAutomationHelper.ttinclude" #>  
 <#@ include file="T4Plugin/MultiOutput.tt" #>  
 <#  
   var dbProjectNamespace = "MyProject.DB";  
   var modelProjectNamespace = "MyProject.Model";  
   var repositoryNamespace = dbProjectNamespace + ".Repository";  
   var domainModelNamespace = modelProjectNamespace + ".DomainModel";  
   var repositoryInterfaceNamespace = "MyProject.Interface.Repository";  
   // Get a reference to the current project.  
   var dbProject = VisualStudioHelper.GetProject(dbProjectNamespace);  
   // Database model project  
   var modelProject = VisualStudioHelper.GetProject(modelProjectNamespace);  
   // Get all class items from the code model  
   var allRepoClasses = VisualStudioHelper.  
     GetAllCodeElementsOfType(dbProject.CodeModel.CodeElements, EnvDTE.vsCMElement.vsCMElementClass, false)  
     .Where(model => model.FullName.StartsWith(repositoryNamespace));  
   var allModelClasses = VisualStudioHelper.GetAllCodeElementsOfType(modelProject.CodeModel.CodeElements,   
             EnvDTE.vsCMElement.vsCMElementClass, false);  
   // Iterate all database models   
   foreach(CodeClass2 modelClass in allModelClasses  
     .OfType<CodeClass>().Where(clas => clas.FullName.StartsWith(domainModelNamespace) &&   
     !clas.FullName.EndsWith("MetadataSource")  
     ).OrderBy(clas => clas.FullName))  
   {  
     var fileName = "I" + modelClass.Name + "Repository.Generated.cs";  
     // Replace model namespace with repository namespace  
     var nameSpace = modelClass.Namespace.Name.Replace(domainModelNamespace, repositoryInterfaceNamespace);  
     #>  
 namespace <#= nameSpace #>  
 {  
   //------------------------------------------------------------------------------  
   // <auto-generated>  
   //   This code was generated from a template and will be re-created if deleted  
   //       with default values if executed.  
   // </auto-generated>  
   //------------------------------------------------------------------------------  
   using <#= modelClass.Namespace.Name #>;  
   using System;  
   /// <summary>  
   /// Interface to interact with <see cref="<#= modelClass.FullName#>"/> domain model.  
   /// </summary>  
   public partial interface I<#= modelClass.Name#>Repository  
        : IRepository<<#= modelClass.Name#>>  
   {  
 <#  
     // Repository classes  
     foreach(var repoClass in allRepoClasses.OfType<CodeClass2>().  
       Where(model => model.Name == modelClass.Name + "Repository"))  
     {  
       foreach(var partialClas in repoClass.PartialClasses.OfType<CodeClass2>())  
       {  
         var allFunctions = VisualStudioHelper.GetAllCodeElementsOfType(partialClas.Members, EnvDTE.vsCMElement.vsCMElementFunction, false);  
         foreach(var func in allFunctions.OfType<CodeFunction2>()  
           .Where(fun => fun.Name != modelClass.Name + "Repository"))  
         {  
            if(!string.IsNullOrEmpty(func.DocComment)){  
             var lines = func.DocComment.Split('\n');  
             for(int idx = 1; idx < (lines.Length-1); idx++) {  
                 #>   /// <#= lines[idx] #> <#  
             }  
           }  
                      #>  
     <#= GenerateFunctionStub(func) #>  
  <#  
         }  
       }  
     }      
     #>  
   }  
 }  
 <#  
     SaveOutput(fileName);  
     DeleteOldOutputs();  
   }  
 #>  
 <#+  
   private string GenerateFunctionStub(CodeFunction2 func)  
   {  
     var parametrs = new StringBuilder();  
     foreach (var item in func.Parameters.OfType<CodeParameter2>()) {  
       if ((parametrs.Length > 0)) {  
         parametrs.Append(", ");  
       }  
       // TODO: Implement other parameter kind  
       switch (item.ParameterKind)  
       {  
         case vsCMParameterKind.vsCMParameterKindOut:  
           parametrs.Append("out ");  
           break;  
         case vsCMParameterKind.vsCMParameterKindRef:  
           parametrs.Append("ref ");  
           break;  
         default:  
           break;  
       }  
       parametrs.AppendFormat("{0} {1}",  
           (item.Type.AsFullName.StartsWith("System") ?  
           "global::" + (item.Type.AsFullName.StartsWith("System.Nullable<System")?  
                   item.Type.AsFullName.Replace("System.Nullable<System", "System.Nullable<global::System") :  
                   item.Type.AsFullName  
                   )  
           : item.Type.AsFullName)  
           , item.FullName);  
     }  
     // Build up the line from the function  
     var funcBody = new StringBuilder();  
     funcBody.AppendFormat("{0}{1} {2}({3})",   
       (func.Type.AsFullName.StartsWith("System") ? "global::" +  
       func.Type.AsFullName.Replace("<System","<global::System").Replace(",System", ",global::System")  
       :func.Type.AsFullName)  
       ,string.IsNullOrEmpty(func.Type.AsFullName) ? "void" : ""  
       , func.Name, parametrs);  
     if (func.FunctionKind == EnvDTE.vsCMFunction.vsCMFunctionConstant) {  
       funcBody.Append(" const");  
     }  
     funcBody.Append(";");  
     return funcBody.ToString();  
   }  
   public string GetXmlComment(string xmlDocComment, int tabAddition)  
   {  
      if(string.IsNullOrEmpty(xmlDocComment)){  
       return string.Empty;  
     }  
     var comment = new StringBuilder();  
     string appender = string.Empty;  
     for(int tabCtr = 0; tabCtr < tabAddition; tabCtr++){  
       appender += "\t";  
     }  
     var lines = xmlDocComment.Split('\n');  
     for(int ctrLine = 1; ctrLine < (lines.Length-1); ctrLine++){  
       comment.AppendFormat("{0} {1}", appender, lines[ctrLine]);  
     }  
     return comment.ToString();  
   }  
  #>  

var dbProjectNamespace = "MyProject.DB": Database layer project name.
var modelProjectNamespace = "MyProject.Model": Database model project name.
var repositoryNamespace = dbProjectNamespace + ".Repository": Repository namespace from where actual class files would be read to generate interface.
var domainModelNamespace = modelProjectNamespace + ".DomainModel": The database model classes for generating interfaces. Based on these files interfaces are generated for model.
   var repositoryInterfaceNamespace = "MyProject.Interface.Repository": The actual interface repository namespace.

This T4 generates the interfaces, including function stubs with the XML comments written on the repository classes.
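For the hypothetical Customer model shown earlier, the template would emit a file named ICustomerRepository.Generated.cs with roughly this shape (exact type qualification and parameter naming depend on the EnvDTE metadata):

 namespace MyProject.Interface.Repository
 {
   //------------------------------------------------------------------------------
   // <auto-generated>
   //   This code was generated from a template and will be re-created if deleted
   //       with default values if executed.
   // </auto-generated>
   //------------------------------------------------------------------------------
   using MyProject.Model.DomainModel;
   using System;
   /// <summary>
   /// Interface to interact with <see cref="MyProject.Model.DomainModel.Customer"/> domain model.
   /// </summary>
   public partial interface ICustomerRepository
        : IRepository<Customer>
   {
      /// <summary>
      /// Gets a customer by its primary key.
      /// </summary>
     MyProject.Model.DomainModel.Customer GetById(global::System.Int32 customerId);
   }
 }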

Similarly, we can generate repository classes, unit of work properties and much more automatically.
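For instance, the same iteration over the domain model classes could also emit unit of work members, roughly like this sketch (IMyProjectUnitOfWork and IOrderRepository are hypothetical names, not produced by the template above):

 public partial interface IMyProjectUnitOfWork
 {
   // One property per domain model, generated the same way as the interfaces above.
   ICustomerRepository CustomerRepository { get; }
   IOrderRepository OrderRepository { get; }
 }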
