Telerik Products now easily available in India via India Distributor

Diwali has just passed and the festivities are dialling down. Diwali has brought changes for Telerik in India, and I now write this as an evangelist for Telerik/Progress. In this blog post, I will list the changes and clarify whether they have any impact on our community:

Telerik (a part of Progress) has structurally changed its operations in India. Indian Rupee billing was the number one request from businesses, who found USD billing to a US entity time-consuming and troublesome.

Telerik products will now be sold in India in Indian Rupees only, via a distributor. While this does not affect you as a developer, your accounts team and business managers will love this change for the following reasons:

  1. A lot less hassle in making payments. Earlier, they had to get multiple forms (Form 15 CA & Form 15 CB) signed by a Chartered Accountant and make the payment via a wire transfer through the bank. This could take 4-5 days for a single payment and accrue additional wire-transfer charges. Now, it is a simple NEFT or UPI transfer.
  2. Lower project costs for customers. Earlier, products like Telerik DevCraft couldn’t be used to offset indirect tax liability. Under the new GST regime, your accounts team can offset the tax on the purchase, lowering the cost of the implementation for their customers in India.

Isn’t this just great?

The new distributor for the India market is GTM Catalyst. What is even better is that I will be handling this new distributor organisation, which brings familiarity and continuity of business for you.

You may reach the GTM Catalyst team with any questions or comments at: info@gtmcatalyst.com

To allay any questions in your mind: all Telerik products continue to be available for sale and are fully supported by Telerik/Progress. All products are moving ahead at full steam, including DevCraft, Kendo UI, Telerik Platform, Telerik Reporting, Sitefinity and Test Studio. Telerik (a Progress company) will continue to release new updates at a rigorous pace and to provide you with the benefits of the latest technologies.

The latest webinar, for the R3 release of Telerik controls, is now available here: https://www.youtube.com/watch?time_continue=10&v=sxv_7RnOwVI

Lastly, we are eager to continue our India webinar series to share our learnings with you. We will be sending the webinar schedule to you shortly with an update here.

Remember to stay tuned here!

Hierarchical structure using Telerik TreeView in ASP.NET MVC

One of our customers required hierarchical UI to be implemented. The data was residing in an API and required remote data binding. This task is very easy to implement with Telerik TreeView from Telerik UI for ASP.NET MVC suite.

A TreeView component represents hierarchical data in a tree structure. It allows users to perform single or multiple selection of items and to drag and drop elements within the TreeView.

The Telerik UI for ASP.NET MVC TreeView component comes with built-in checkbox support, keyboard navigation, RTL support, accessibility and provides templates for complete customization of each node. You can bind the TreeView to various data sources and take advantage of its load on demand feature, and request data only when a node is expanded.

Let’s see how we can use the Telerik TreeView control to implement hierarchical structures:

  1. Create a sample database and table using the script below

For a hierarchical structure we need a one-to-many relation between tables; in the example below, we create the one-to-many relation from the Employees table to itself (a single table). In this table, EmployeeID is the primary key and ReportsTo is the foreign key, as you can see in the script below:

CREATE DATABASE [Sample]

GO
USE [Sample]
GO
/****** Object:  Table [dbo].[Employees]    Script Date: 2/16/2021 1:34:48 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Employees](
	[EmployeeID] [int] IDENTITY(1,1) NOT NULL,
	[LastName] [nvarchar](20) NOT NULL,
	[FirstName] [nvarchar](10) NOT NULL,
	[Title] [nvarchar](30) NULL,
	[TitleOfCourtesy] [nvarchar](25) NULL,
	[BirthDate] [datetime] NULL,
	[HireDate] [datetime] NULL,
	[Address] [nvarchar](60) NULL,
	[City] [nvarchar](15) NULL,
	[Region] [nvarchar](15) NULL,
	[PostalCode] [nvarchar](10) NULL,
	[Country] [nvarchar](15) NULL,
	[HomePhone] [nvarchar](24) NULL,
	[Extension] [nvarchar](4) NULL,
	[ReportsTo] [int] NULL,
 CONSTRAINT [PK_Employees] PRIMARY KEY CLUSTERED 
(
	[EmployeeID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET IDENTITY_INSERT [dbo].[Employees] ON 

INSERT [dbo].[Employees] ([EmployeeID], [LastName], [FirstName], [Title], [TitleOfCourtesy], [BirthDate], [HireDate], [Address], [City], [Region], [PostalCode], [Country], [HomePhone], [Extension], [ReportsTo]) VALUES (1, N'Davolio', N'Nancy', N'Sales Representative', N'Ms.', CAST(N'1948-12-08T00:00:00.000' AS DateTime), CAST(N'1992-05-01T00:00:00.000' AS DateTime), N'507 - 20th Ave. E.
Apt. 2A', N'Seattle', N'WA', N'98122', N'USA', N'(206) 555-9857', N'5467', 2)

INSERT [dbo].[Employees] ([EmployeeID], [LastName], [FirstName], [Title], [TitleOfCourtesy], [BirthDate], [HireDate], [Address], [City], [Region], [PostalCode], [Country], [HomePhone], [Extension], [ReportsTo]) VALUES (2, N'Fuller', N'Andrew', N'Vice President, Sales', N'Dr.', CAST(N'1952-02-19T00:00:00.000' AS DateTime), CAST(N'1992-08-14T00:00:00.000' AS DateTime), N'908 W. Capital Way', N'Tacoma', N'WA', N'98401', N'USA', N'(206) 555-9482', N'3457', NULL)

INSERT [dbo].[Employees] ([EmployeeID], [LastName], [FirstName], [Title], [TitleOfCourtesy], [BirthDate], [HireDate], [Address], [City], [Region], [PostalCode], [Country], [HomePhone], [Extension], [ReportsTo]) VALUES (3, N'Leverling', N'Janet', N'Sales Representative', N'Ms.', CAST(N'1963-08-30T00:00:00.000' AS DateTime), CAST(N'1992-04-01T00:00:00.000' AS DateTime), N'722 Moss Bay Blvd.', N'Kirkland', N'WA', N'98033', N'USA', N'(206) 555-3412', N'3355', 2)

INSERT [dbo].[Employees] ([EmployeeID], [LastName], [FirstName], [Title], [TitleOfCourtesy], [BirthDate], [HireDate], [Address], [City], [Region], [PostalCode], [Country], [HomePhone], [Extension], [ReportsTo]) VALUES (4, N'Peacock', N'Margaret', N'Sales Representative', N'Mrs.', CAST(N'1937-09-19T00:00:00.000' AS DateTime), CAST(N'1993-05-03T00:00:00.000' AS DateTime), N'4110 Old Redmond Rd.', N'Redmond', N'WA', N'98052', N'USA', N'(206) 555-8122', N'5176', 2)

INSERT [dbo].[Employees] ([EmployeeID], [LastName], [FirstName], [Title], [TitleOfCourtesy], [BirthDate], [HireDate], [Address], [City], [Region], [PostalCode], [Country], [HomePhone], [Extension], [ReportsTo]) VALUES (5, N'Buchanan', N'Steven', N'Sales Manager', N'Mr.', CAST(N'1955-03-04T00:00:00.000' AS DateTime), CAST(N'1993-10-17T00:00:00.000' AS DateTime), N'14 Garrett Hill', N'London', NULL, N'SW1 8JR', N'UK', N'(71) 555-4848', N'3453', 2)

INSERT [dbo].[Employees] ([EmployeeID], [LastName], [FirstName], [Title], [TitleOfCourtesy], [BirthDate], [HireDate], [Address], [City], [Region], [PostalCode], [Country], [HomePhone], [Extension], [ReportsTo]) VALUES (6, N'Suyama', N'Michael', N'Sales Representative', N'Mr.', CAST(N'1963-07-02T00:00:00.000' AS DateTime), CAST(N'1993-10-17T00:00:00.000' AS DateTime), N'Coventry House
Miner Rd.', N'London', NULL, N'EC2 7JR', N'UK', N'(71) 555-7773', N'428', 5)

INSERT [dbo].[Employees] ([EmployeeID], [LastName], [FirstName], [Title], [TitleOfCourtesy], [BirthDate], [HireDate], [Address], [City], [Region], [PostalCode], [Country], [HomePhone], [Extension], [ReportsTo]) VALUES (7, N'King', N'Robert', N'Sales Representative', N'Mr.', CAST(N'1960-05-29T00:00:00.000' AS DateTime), CAST(N'1994-01-02T00:00:00.000' AS DateTime), N'Edgeham Hollow
Winchester Way', N'London', NULL, N'RG1 9SP', N'UK', N'(71) 555-5598', N'465', 5)

INSERT [dbo].[Employees] ([EmployeeID], [LastName], [FirstName], [Title], [TitleOfCourtesy], [BirthDate], [HireDate], [Address], [City], [Region], [PostalCode], [Country], [HomePhone], [Extension], [ReportsTo]) VALUES (8, N'Callahan', N'Laura', N'Inside Sales Coordinator', N'Ms.', CAST(N'1958-01-09T00:00:00.000' AS DateTime), CAST(N'1994-03-05T00:00:00.000' AS DateTime), N'4726 - 11th Ave. N.E.', N'Seattle', N'WA', N'98105', N'USA', N'(206) 555-1189', N'2344', 2)

INSERT [dbo].[Employees] ([EmployeeID], [LastName], [FirstName], [Title], [TitleOfCourtesy], [BirthDate], [HireDate], [Address], [City], [Region], [PostalCode], [Country], [HomePhone], [Extension], [ReportsTo]) VALUES (9, N'Dodsworth', N'Anne', N'Sales Representative', N'Ms.', CAST(N'1966-01-27T00:00:00.000' AS DateTime), CAST(N'1994-11-15T00:00:00.000' AS DateTime), N'7 Houndstooth Rd.', N'London', NULL, N'WG2 7LT', N'UK', N'(71) 555-4444', N'452', 5)

SET IDENTITY_INSERT [dbo].[Employees] OFF
GO

ALTER TABLE [dbo].[Employees]  WITH NOCHECK ADD  CONSTRAINT [FK_Employees_Employees] FOREIGN KEY([ReportsTo])
REFERENCES [dbo].[Employees] ([EmployeeID])
GO

ALTER TABLE [dbo].[Employees] CHECK CONSTRAINT [FK_Employees_Employees]
GO

ALTER TABLE [dbo].[Employees]  WITH NOCHECK ADD  CONSTRAINT [CK_Birthdate] CHECK  (([BirthDate] < getdate()))
GO

ALTER TABLE [dbo].[Employees] CHECK CONSTRAINT [CK_Birthdate]
GO

The Employees table should look like the following:
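For reference, here are the key hierarchy columns as inserted by the script above:

EmployeeID  FirstName  LastName   ReportsTo
1           Nancy      Davolio    2
2           Andrew     Fuller     NULL
3           Janet      Leverling  2
4           Margaret   Peacock    2
5           Steven     Buchanan   2
6           Michael    Suyama     5
7           Robert     King       5
8           Laura      Callahan   2
9           Anne       Dodsworth  5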

Next, create the Entity Framework data model.

After creating the model, the generated Employee class looks like this:

public partial class Employee
    {
        public Employee()
        {
            this.Employees1 = new HashSet<Employee>();
        }
    
        public int EmployeeID { get; set; }
        public string LastName { get; set; }
        public string FirstName { get; set; }
        public string Title { get; set; }
        public string TitleOfCourtesy { get; set; }
        public Nullable<System.DateTime> BirthDate { get; set; }
        public Nullable<System.DateTime> HireDate { get; set; }
        public string Address { get; set; }
        public string City { get; set; }
        public string Region { get; set; }
        public string PostalCode { get; set; }
        public string Country { get; set; }
        public string HomePhone { get; set; }
        public string Extension { get; set; }
        public Nullable<int> ReportsTo { get; set; }
        public string PhotoPath { get; set; }
    
        public virtual ICollection<Employee> Employees1 { get; set; }
        public virtual Employee Employee1 { get; set; }
    }

2. Now add the following action to the controller, TreeviewController.cs

Note also that hasChildren uses a navigation property generated in the EF model (Employees1). That property is present because we created a one-to-many relation from the Employees table to itself.

public JsonResult Remote_Data_Binding_Get_Employees(int? id)
{
    using (TelerikEntities entities = new TelerikEntities())
    {
        var data = from e in entities.Employees
                   where (id.HasValue ? e.ReportsTo == id : e.ReportsTo == null)
                   select new
                   {
                       id = e.EmployeeID,
                       Name = e.FirstName,
                       hasChildren = e.Employees1.Any()
                   };

        return Json(data.ToList(), JsonRequestBehavior.AllowGet);
    }
}

In the Remote_Data_Binding_Get_Employees action of TreeviewController.cs, note the id, Name and hasChildren fields returned for each node.

Note that, because the Name field is returned by the controller, the same field name must be used by the TreeView in the index.cshtml file, and the data must be returned as a list.

3. In the view, index.cshtml

<div class="demo-section k-content">
        @(Html.Kendo().TreeView()
        .Name("treeview")
        .DataTextField("Name")
        .DataSource(dataSource => dataSource
            .Read(read => read
                .Action("Remote_Data_Binding_Get_Employees", "TreeView")
                )
        )
    )
</div>

In the above code we are doing remote data binding; in the .Action method we pass the action (method) name and the controller name.

Here is the hierarchical output we have achieved:
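Based on the sample data above, the rendered TreeView hierarchy is roughly:

Andrew
  Nancy
  Janet
  Margaret
  Steven
    Michael
    Robert
    Anne
  Laura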

Make reusable web controls with Angular and Telerik Kendo UI

Angular normally requires loading the entire framework for its components to work, effectively taking over the whole application being built. Web Components provide a specification by which we can make these Angular components available for use from plain HTML. It is a web standard for defining new HTML elements in a framework-agnostic way.

Specifically, Angular elements are Angular components packaged as custom elements (also called Web Components).

One of the questions our customers ask is: when they use Kendo UI, are Angular Elements supported? The answer is a resounding yes, and below we detail a simple step-by-step guide to showcase this capability using the Kendo UI Chart control:

1. Install Angular CLI and create a new project

npm i -g @angular/cli
ng new angular-custom-elements

2. Activate your Trial or commercial License

Kendo UI for Angular is a professionally developed library distributed under a commercial license. Starting from December 2020, using any of the UI components from the Kendo UI for Angular library requires either a commercial license key or an active trial license key.

After logging in to your Telerik account, download your Telerik license key and save the kendo-ui-license.txt license key file in the project folder.

Install or Update a License Key

  • Copy the license key file (kendo-ui-license.txt) to the root folder of your project. Alternatively, copy the contents of the file to the KENDO_UI_LICENSE environment variable.
  • Install @progress/kendo-licensing as a project dependency by running npm install --save @progress/kendo-licensing or yarn add @progress/kendo-licensing.
  • Run npx kendo-ui-license activate or yarn run kendo-ui-license activate in the console.

Adding the Kendo UI Components

Kendo UI for Angular is distributed as multiple NPM packages scoped to @progress. For example, the name of the Grid package is @progress/kendo-angular-grid. As of the Angular 6 release, Angular CLI introduces the ng add command which provides for a faster and more user-friendly package installation. For more information, refer to the article on using Kendo UI for Angular with Angular CLI.

3. Let’s start and add the Charts package:

Angular CLI supports the addition of packages through the ng add command which executes in one step the set of otherwise individually needed commands.

ng add @progress/kendo-angular-charts

The command installs all necessary packages, sets up the default theme, and imports the component module. The full set of applied changes can be seen by running git diff at any time.

Manual Setup

All components that you reference during the installation will be present in the final bundle of your application. To avoid ending up with components you do not actually need, either:

  • Import all Charts components at once by using the ChartsModule, or
  • Import a specific Charts component by adding it as an individual NgModule.

Download and install the package.

npm install --save @progress/kendo-angular-charts @progress/kendo-angular-common @progress/kendo-angular-intl @progress/kendo-angular-l10n @progress/kendo-angular-popup @progress/kendo-drawing hammerjs @progress/kendo-licensing

Once installed, import Hammer.js and the NgModule of the components you need.

To get all package components, import the ChartsModule in your application root module or feature module in app.module.ts.

4. Add the elements package

Custom elements are a Web Platform feature currently supported by Chrome, Edge (Chromium-based), Firefox, Opera, and Safari, and available in other browsers through polyfills.

ng add @angular/elements

5. Create a component

ng g component chart --inline-style --inline-template -v None

6. Add properties to the component
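Here is a minimal sketch of what chart.component.ts could look like, assuming categories, series and title inputs that match the index.html used later in this post (the line chart type and the series shape are illustrative, not prescribed by the original post):

import { Component, Input, ViewEncapsulation } from '@angular/core';

@Component({
  selector: 'app-chart',
  encapsulation: ViewEncapsulation.None,
  template: `
    <kendo-chart>
      <kendo-chart-title [text]="title"></kendo-chart-title>
      <kendo-chart-category-axis>
        <kendo-chart-category-axis-item [categories]="categories">
        </kendo-chart-category-axis-item>
      </kendo-chart-category-axis>
      <kendo-chart-series>
        <!-- one series per entry in the series input -->
        <kendo-chart-series-item *ngFor="let item of series"
                                 type="line" [name]="item.name" [data]="item.data">
        </kendo-chart-series-item>
      </kendo-chart-series>
    </kendo-chart>
  `
})
export class ChartComponent {
  // These inputs become properties of the <app-chart> custom element
  @Input() categories: number[] = [];
  @Input() series: Array<{ name: string; data: number[] }> = [];
  @Input() title = '';
}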

7. Update NgModule
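And a sketch of app.module.ts wired up for Angular Elements, assuming the ChartComponent above (on recent Angular versions the entryComponents array is no longer required):

import { Injector, NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { BrowserAnimationsModule } from '@angular/platform-browser/animations';
import { createCustomElement } from '@angular/elements';
import { ChartsModule } from '@progress/kendo-angular-charts';
import { ChartComponent } from './chart/chart.component';

import 'hammerjs';

@NgModule({
  declarations: [ChartComponent],
  imports: [BrowserModule, BrowserAnimationsModule, ChartsModule],
  entryComponents: [ChartComponent]
})
export class AppModule {
  constructor(private injector: Injector) { }

  ngDoBootstrap() {
    // Wrap the Angular component as a browser custom element and register it
    const chartElement = createCustomElement(ChartComponent, { injector: this.injector });
    customElements.define('app-chart', chartElement);
  }
}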

8. Building the Angular Project for Production

ng build --prod --output-hashing=none

Now we need to create a build script (angular-element-build.js) to produce a single JS file from the multiple files generated by the Angular CLI.

You need to install fs-extra and concat from npm using:

npm install fs-extra concat

9. In the root of your application, create the build script file angular-element-build.js and add the code below:

const fs = require('fs-extra');
const concat = require('concat');

(async function build() {
  const files = [
    './dist/chart-custom-element/runtime.js',
    './dist/chart-custom-element/polyfills.js',
    './dist/chart-custom-element/main.js',
  ];

  try {
    await fs.ensureDir('angular-elements');
    await fs.copy('./dist/chart-custom-element/styles.css', 'angular-elements/styles.css');
    await concat(files, 'angular-elements/chart-angular-element.js');
  } catch (err) {
    console.log(err);
  }
})();

10. Run the script using the command below:

node angular-element-build.js

The above command will create an angular-elements folder containing chart-angular-element.js, and copy the styles.css file into the angular-elements folder.

11. Use the Angular element in plain HTML

Add an index.html file in the angular-elements folder with the code below:

<!DOCTYPE html>
<html lang="en">
<head>
    <base href="/">
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <link rel="icon" type="image/x-icon" href="favicon.ico">
    <link rel="stylesheet" href="styles.css" />
    <title>Testing our custom chart element</title>
</head>
<body>
    <div class="container">
        New component
        <app-chart></app-chart>
        <script type="text/javascript" src="chart-angular-element.js"></script>
        <script>
            let arr = [2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011];
            let arrseries = [{
                name: 'India',
                data: [3.907, 7.943, 7.848, 9.284, 9.263, 9.801, 3.890, 8.238, 9.552, 6.855]
            }, {
                name: 'Russian Federation',
                data: [4.743, 7.295, 7.175, 6.376, 8.153, 8.535, 5.247, -7.832, 4.3, 4.3]
            }, {
                name: 'Germany',
                data: [0.010, -0.375, 1.161, 0.684, 3.7, 3.269, 1.083, -5.127, 3.690, 2.995]
            }, {
                name: 'World',
                data: [1.988, 2.733, 3.994, 3.464, 4.001, 3.939, 1.333, -2.245, 4.339, 2.727]
            }]
            let title = "New Title"
            let querySelect = document.querySelector('app-chart');
            querySelect.categories = arr;
            querySelect.series = arrseries;
            querySelect.title = title;
        </script>
    </div>
</body>
</html>

Install live-server using the command below:

npm install -g live-server

Navigate to your angular-elements folder and run the commands below:

cd angular-elements
npx live-server

A browser window will open at http://localhost:8080/

Now you can see the Angular component working outside of an Angular application.

Get Started with Kendo UI for Angular

Our customers enjoy building a good UI for their projects. They rely on Kendo UI to deliver an outstanding development experience along with the most popular JS framework of the modern web – Angular. Kendo UI for Angular is a professionally developed library distributed under a commercial license.

Some of them are not sure how to get started with Kendo UI for Angular. In the post below we detail how you can start using Kendo UI with Angular and include a Chart control:

First Step => Setting up the Angular project

The easiest way to start with Angular is to use the Angular CLI Tool. To scaffold your project structure, follow its installation instructions.

npm install -g @angular/cli
ng new my-first-angular-project
cd my-first-angular-project

Second Step => Activate your Trial or commercial License

Starting from December 2020, using any of the UI components from the Kendo UI for Angular library requires either a commercial license key or an active trial license key.

After logging in to your Telerik account, download your Telerik license key. Next, save the kendo-ui-license.txt license key file in the project folder.

Install or Update a License Key

  • Copy the license key file (kendo-ui-license.txt) to the root folder of your project. Alternatively, copy the contents of the file to the KENDO_UI_LICENSE environment variable.
  • Install @progress/kendo-licensing as a project dependency by running npm install --save @progress/kendo-licensing or yarn add @progress/kendo-licensing.
  • Run npx kendo-ui-license activate or yarn run kendo-ui-license activate in the console.

Adding the Kendo UI Components

Kendo UI for Angular is distributed as multiple NPM packages scoped to @progress. For example, the name of the Grid package is @progress/kendo-angular-grid. As of the Angular 6 release, Angular CLI introduces the ng add command which provides for a faster and more user-friendly package installation. For more information, refer to the article on using Kendo UI for Angular with Angular CLI.

1. Let’s start and add the Charts package:

Angular CLI supports the addition of packages through the ng add command which executes in one step the set of otherwise individually needed commands.

ng add @progress/kendo-angular-charts

The command installs all necessary packages, sets up the default theme, and imports the component module. The full set of applied changes can be seen by running git diff at any time.

Manual Setup

All components that you reference during the installation will be present in the final bundle of your application. To avoid ending up with components you do not actually need, either:

  • Import all Charts components at once by using the ChartsModule, or
  • Import a specific Charts component by adding it as an individual NgModule.

Download and install the package.

npm install --save @progress/kendo-angular-charts @progress/kendo-angular-common @progress/kendo-angular-intl @progress/kendo-angular-l10n @progress/kendo-angular-popup @progress/kendo-drawing hammerjs @progress/kendo-licensing

Once installed, import Hammer.js and the NgModule of the components you need.

To get all package components, import the ChartsModule in your application root module or feature module in app.module.ts.

    import { NgModule } from '@angular/core';
    import { BrowserModule } from '@angular/platform-browser';
    import { BrowserAnimationsModule } from '@angular/platform-browser/animations';
    import { ChartsModule } from '@progress/kendo-angular-charts';
    import { AppComponent } from './app.component';

    import 'hammerjs';

    @NgModule({
        bootstrap:    [AppComponent],
        declarations: [AppComponent],
        imports:      [BrowserModule, BrowserAnimationsModule, ChartsModule]
    })
    export class AppModule {
    }

and use the Chart in app.component.ts:

import { Component } from '@angular/core';

@Component({
  selector: 'app-root',
  template: `<kendo-chart>
  <kendo-chart-title text="Units sold"></kendo-chart-title>
  <kendo-chart-category-axis>
      <kendo-chart-category-axis-item [categories]="['Q1', 'Q2', 'Q3', 'Q4']">
      </kendo-chart-category-axis-item>
  </kendo-chart-category-axis>
  <kendo-chart-series>
    <kendo-chart-series-item type="bar" [gap]="2" [spacing]=".25" [data]="[100, 123, 234, 343]">
    </kendo-chart-series-item>
    <kendo-chart-series-item type="bar" [data]="[120, 67, 231, 196]">
    </kendo-chart-series-item>
    <kendo-chart-series-item type="bar" [data]="[45, 124, 189, 143]">
    </kendo-chart-series-item>
    <kendo-chart-series-item type="bar" [data]="[87, 154, 210, 215]">
    </kendo-chart-series-item>
  </kendo-chart-series>
</kendo-chart>`
})
export class AppComponent {
  title = 'chart-sample';
}

Kendo UI for Angular provides themes that you can use to style your application.

Currently, the suite ships the Kendo Default, Bootstrap and Material themes.
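As an illustration (assuming the Default theme; the Bootstrap and Material themes have equivalent packages), a theme can be installed and wired up like this:

npm install --save @progress/kendo-theme-default

Then reference it, for example from src/styles.css:

@import "@progress/kendo-theme-default/dist/all.css";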

Let us know your experience getting started with Kendo UI in Angular…

GTM Catalyst to Offer JetBrains Solutions to Enterprises in India

The development landscape is very heterogeneous, with multiple teams using different languages. Consequently, development teams need a myriad of solutions that support these languages, and tools to accelerate development.

Today we announce that we are partnering with JetBrains, a leading provider of development, deployment, and collaboration tools (a portfolio of 28 products). With this partnership, GTM Catalyst Private Limited will offer local expertise and support to businesses leveraging JetBrains solutions and will also make them available for purchase in Indian Rupees.

As a JetBrains reseller channel partner, GTM Catalyst Private Limited now offers award-winning developer tools including:


1. IDE: IntelliJ IDEA, PyCharm, WebStorm, RubyMine, GoLand, AppCode, and PhpStorm.
2. Collaboration: Developers benefit from the CI/CD tool TeamCity and from Space.
3. Productivity extensions (.NET): ReSharper, dotTrace, and dotMemory.

The flagship product from JetBrains is IntelliJ IDEA, which maximizes Java developer productivity with its intelligent coding assistance and ergonomic design.

PyCharm, the JetBrains IDE for professional developers, helps developers using Python, the fastest-growing language. For DevOps, CI/CD is supported via TeamCity, a build management and continuous integration server from JetBrains.

Space, the recently launched all-in-one collaboration solution, provides a toolset for instant communication, software development, and team and project management.

“As a company, JetBrains has strived to make the strongest and most effective tools for software developers and teams. We are committed to supporting developers in India with our wide range of tools. We’re very happy to welcome GTM Catalyst Private Limited as our channel partner in India”, said Javed Mohamed, Regional Head – South Asia at JetBrains.

Read the press release here

How to use NLog with ASP.NET Core 2

We can build an ASP.NET Core app without any logging, but in the real world we should use some form of logging. In this blog post we provide an overview of third-party logging solutions such as NLog.

NLog is a flexible and free logging platform for various .NET platforms, including .NET Standard. NLog makes it easy to write to several targets (database, file, console) and to change the logging configuration on the fly.

NLog supports the following platforms:

  • .NET Framework 3.5, 4, 4.5 – 4.8
  • .NET Framework 4 client profile
  • Xamarin Android
  • Xamarin iOS
  • Windows Phone 8
  • Silverlight 4 and 5
  • Mono 4
  • ASP.NET 4 (NLog.Web package)
  • ASP.NET Core (NLog.Web.AspNetCore package)
  • .NET Core (NLog.Extensions.Logging package)
  • .NET Standard 1.x – NLog 4.5
  • .NET Standard 2.x – NLog 4.5
  • UWP – NLog 4.5

Getting started with ASP.NET Core 2

Following are the steps to configure NLog in an ASP.NET Core application:

1. Add dependency in csproj manually or using NuGet

Install the latest:

  1. PM> Install-Package NLog  
  2. PM> Install-Package NLog.Web.AspNetCore 

Or in csproj:

<PackageReference Include="NLog.Web.AspNetCore" Version="4.9.3" />
<PackageReference Include="NLog" Version="4.7.6" />

2. Create a nlog.config file

Create an nlog.config file (all lowercase) in the root of your project.

<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogLevel="Info"
      internalLogFile="c:\temp\internal-nlog.txt">

  <!-- enable asp.net core layout renderers -->
  <extensions>
    <add assembly="NLog.Web.AspNetCore"/>
  </extensions>

  <!-- the targets to write to -->
  <targets>
    <!-- write logs to file  -->
    <target xsi:type="File" name="allfile" fileName="c:\temp\nlog-all-${shortdate}.log"
            layout="${longdate}|${event-properties:item=EventId_Id}|${uppercase:${level}}|${logger}|${message} ${exception:format=tostring}" />

    <!-- another file log, only own logs. Uses some ASP.NET core renderers -->
    <target xsi:type="File" name="ownFile-web" fileName="c:\temp\nlog-own-${shortdate}.log"
            layout="${longdate}|${event-properties:item=EventId_Id}|${uppercase:${level}}|${logger}|${message} ${exception:format=tostring}|url: ${aspnet-request-url}|action: ${aspnet-mvc-action}" />
  </targets>

  <!-- rules to map from logger name to target -->
  <rules>
    <!--All logs, including from Microsoft-->
    <logger name="*" minlevel="Trace" writeTo="allfile" />

    <!--Skip non-critical Microsoft logs and so log only own logs-->
    <logger name="Microsoft.*" maxlevel="Info" final="true" /> <!-- BlackHole without writeTo -->
    <logger name="*" minlevel="Trace" writeTo="ownFile-web" />
  </rules>
</nlog>

3. Enable copy to bin folder

In Visual Studio, set the "Copy to Output Directory" property of nlog.config to "Copy if newer", or edit the .csproj file manually and add:

<ItemGroup>
    <Content Update="nlog.config" CopyToOutputDirectory="PreserveNewest" />
</ItemGroup>

4. Update Program.cs

using NLog.Web;
using Microsoft.Extensions.Logging;

public static void Main(string[] args)
{
    // NLog: setup the logger first to catch all errors
    var logger = NLog.Web.NLogBuilder.ConfigureNLog("nlog.config").GetCurrentClassLogger();
    try
    {
        logger.Debug("init main");
        CreateWebHostBuilder(args).Build().Run(); 
    }
    catch (Exception ex)
    {
        //NLog: catch setup errors
        logger.Error(ex, "Stopped program because of exception");
        throw;
    }
    finally
    {
        // Ensure to flush and stop internal timers/threads before application-exit (Avoid segmentation fault on Linux)
        NLog.LogManager.Shutdown();
    }
}

public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .UseStartup<Startup>()
        .ConfigureLogging(logging =>
        {
            logging.ClearProviders();
            logging.SetMinimumLevel(Microsoft.Extensions.Logging.LogLevel.Trace);
        })
        .UseNLog();  // NLog: setup NLog for Dependency injection

5. Configure appsettings.json

The Logging configuration specified in appsettings.json overrides any call to SetMinimumLevel. So either remove "Default": or adjust it correctly to your needs.

{
    "Logging": {
        "LogLevel": {
            "Default": "Trace",
            "Microsoft": "Information"
        }
    }
}

Remember to also update any environment-specific configuration to avoid surprises, e.g. appsettings.Development.json.
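For instance, a minimal appsettings.Development.json might mirror the same structure (the values here are illustrative):

{
    "Logging": {
        "LogLevel": {
            "Default": "Trace",
            "Microsoft": "Information"
        }
    }
}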

6. Write logs

Inject the ILogger in your controller:

using Microsoft.Extensions.Logging;

public class HomeController : Controller
{
    private readonly ILogger<HomeController> _logger;

    public HomeController(ILogger<HomeController> logger)
    {
        _logger = logger;
    }

    public IActionResult Index()
    {
        _logger.LogInformation("Hello, this is the index!");
        return View();
    }
}

7. Example Output

When starting the ASP.NET Core website, we get two files:

nlog-own-2020-12-29.log

2020-12-29 14:40:29.5143||DEBUG|ASP.NET_Core_2___VS2017.Program|init main |url: |action: 
2020-12-29 14:40:32.1326|0|INFO|ASP.NET_Core_2___VS2017.Controllers.HomeController|Hello, this is the index! |url: http://localhost/|action: Index

nlog-all-2020-12-29.log

2020-12-29 14:40:29.5143||DEBUG|ASP.NET_Core_2___VS2017.Program|init main 
2020-12-29 14:40:30.9739|0|INFO|Microsoft.AspNetCore.DataProtection.KeyManagement.XmlKeyManager|User profile is available. Using 'C:\Users\j.verdurmen\AppData\Local\ASP.NET\DataProtection-Keys' as key repository and Windows DPAPI to encrypt keys at rest. 
2020-12-29 14:40:30.9897|37|DEBUG|Microsoft.AspNetCore.DataProtection.Repositories.FileSystemXmlRepository|Reading data from file 'C:\Users\j.verdurmen\AppData\Local\ASP.NET\DataProtection-Keys\key-bfd1ce07-8dc6-4eef-a51a-d21ddb547109.xml'. 
2020-12-29 14:40:31.0004|18|DEBUG|Microsoft.AspNetCore.DataProtection.KeyManagement.XmlKeyManager|Found key {bfd1ce07-8dc6-4eef-a51a-d21ddb547109}. 
2020-12-29 14:40:31.0124|13|DEBUG|Microsoft.AspNetCore.DataProtection.KeyManagement.DefaultKeyResolver|Considering key {bfd1ce07-8dc6-4eef-a51a-d21ddb547109} with expiration date 2017-12-28 19:01:07Z as default key. 
2020-12-29 14:40:31.0422|0|DEBUG|Microsoft.AspNetCore.DataProtection.TypeForwardingActivator|Forwarded activator type request from Microsoft.AspNetCore.DataProtection.XmlEncryption.DpapiXmlDecryptor, Microsoft.AspNetCore.DataProtection, Version=2.0.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60 to Microsoft.AspNetCore.DataProtection.XmlEncryption.DpapiXmlDecryptor, Microsoft.AspNetCore.DataProtection, Culture=neutral, PublicKeyToken=adb9793829ddae60 
2020-12-29 14:40:31.0422|51|DEBUG|Microsoft.AspNetCore.DataProtection.XmlEncryption.DpapiXmlDecryptor|Decrypting secret element using Windows DPAPI. 
2020-12-29 14:40:31.0422|0|DEBUG|Microsoft.AspNetCore.DataProtection.TypeForwardingActivator|Forwarded activator type request from Microsoft.AspNetCore.DataProtection.AuthenticatedEncryption.ConfigurationModel.AuthenticatedEncryptorDescriptorDeserializer, Microsoft.AspNetCore.DataProtection, Version=2.0.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60 to Microsoft.AspNetCore.DataProtection.AuthenticatedEncryption.ConfigurationModel.AuthenticatedEncryptorDescriptorDeserializer, Microsoft.AspNetCore.DataProtection, Culture=neutral, PublicKeyToken=adb9793829ddae60 
2020-12-29 14:40:31.0422|4|DEBUG|Microsoft.AspNetCore.DataProtection.AuthenticatedEncryption.CngCbcAuthenticatedEncryptorFactory|Opening CNG algorithm 'AES' from provider '(null)' with chaining mode CBC. 
2020-12-29 14:40:31.0543|3|DEBUG|Microsoft.AspNetCore.DataProtection.AuthenticatedEncryption.CngCbcAuthenticatedEncryptorFactory|Opening CNG algorithm 'SHA256' from provider '(null)' with HMAC. 
2020-12-29 14:40:31.0543|2|DEBUG|Microsoft.AspNetCore.DataProtection.KeyManagement.KeyRingProvider|Using key {bfd1ce07-8dc6-4eef-a51a-d21ddb547109} as the default key. 
2020-12-29 14:40:31.0543|0|DEBUG|Microsoft.AspNetCore.DataProtection.Internal.DataProtectionStartupFilter|Key ring with default key {bfd1ce07-8dc6-4eef-a51a-d21ddb547109} was loaded during application startup. 
2020-12-29 14:40:31.4080|3|DEBUG|Microsoft.AspNetCore.Hosting.Internal.WebHost|Hosting starting 
2020-12-29 14:40:31.5508|4|DEBUG|Microsoft.AspNetCore.Hosting.Internal.WebHost|Hosting started 
2020-12-29 14:40:31.5508|0|DEBUG|Microsoft.AspNetCore.Hosting.Internal.WebHost|Loaded hosting startup assembly ASP.NET Core 2 - VS2017 
2020-12-29 14:40:31.5526|0|DEBUG|Microsoft.AspNetCore.Hosting.Internal.WebHost|Loaded hosting startup assembly Microsoft.AspNetCore.ApplicationInsights.HostingStartup 
2020-12-29 14:40:31.5526|0|DEBUG|Microsoft.AspNetCore.Hosting.Internal.WebHost|Loaded hosting startup assembly Microsoft.AspNetCore.Server.IISIntegration 
2020-12-29 14:40:31.6909|1|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK64" started. 
2020-12-29 14:40:31.6909|1|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK65" started. 
2020-12-29 14:40:31.7418|19|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK65" reset. 
2020-12-29 14:40:31.7418|10|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK65" disconnecting. 
2020-12-29 14:40:31.7418|7|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK65" sending FIN. 
2020-12-29 14:40:31.7591|2|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK65" stopped. 
2020-12-29 14:40:31.8153|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/   
2020-12-29 14:40:31.8607|4|DEBUG|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|The request path / does not match a supported file type 
2020-12-29 14:40:32.0160|1|DEBUG|Microsoft.AspNetCore.Routing.RouteBase|Request successfully matched the route with name 'default' and template '{controller=Home}/{action=Index}/{id?}'. 
2020-12-29 14:40:32.1120|1|DEBUG|Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker|Executing action ASP.NET_Core_2___VS2017.Controllers.HomeController.Index (ASP.NET Core 2 - VS2017) 
2020-12-29 14:40:32.1326|1|INFO|Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker|Executing action method ASP.NET_Core_2___VS2017.Controllers.HomeController.Index (ASP.NET Core 2 - VS2017) with arguments ((null)) - ModelState is Valid 
2020-12-29 14:40:32.1326|0|INFO|ASP.NET_Core_2___VS2017.Controllers.HomeController|Hello, this is the index! 
2020-12-29 14:40:32.1620|2|DEBUG|Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker|Executed action method ASP.NET_Core_2___VS2017.Controllers.HomeController.Index (ASP.NET Core 2 - VS2017), returned result Microsoft.AspNetCore.Mvc.ViewResult. 
2020-12-29 14:40:32.1620|1|DEBUG|Microsoft.AspNetCore.Mvc.Razor.RazorViewEngine|View lookup cache miss for view 'Index' in controller 'Home'. 
2020-12-29 14:40:33.6906|1|DEBUG|Microsoft.AspNetCore.Mvc.Razor.Internal.RazorViewCompiler|Compilation of the generated code for the Razor file at 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\Views\Home\Index.cshtml' started. 
2020-12-29 14:40:35.7180|2|DEBUG|Microsoft.AspNetCore.Mvc.Razor.Internal.RazorViewCompiler|Compilation of the generated code for the Razor file at 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\Views\Home\Index.cshtml' completed in 2024.1338ms. 
2020-12-29 14:40:35.7988|1|DEBUG|Microsoft.AspNetCore.Mvc.Razor.Internal.RazorViewCompiler|Compilation of the generated code for the Razor file at 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\Views\_ViewStart.cshtml' started. 
2020-12-29 14:40:35.8637|2|DEBUG|Microsoft.AspNetCore.Mvc.Razor.Internal.RazorViewCompiler|Compilation of the generated code for the Razor file at 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\Views\_ViewStart.cshtml' completed in 63.9912ms. 
2020-12-29 14:40:35.8710|2|DEBUG|Microsoft.AspNetCore.Mvc.ViewFeatures.Internal.ViewResultExecutor|The view 'Index' was found. 
2020-12-29 14:40:35.8710|1|INFO|Microsoft.AspNetCore.Mvc.ViewFeatures.Internal.ViewResultExecutor|Executing ViewResult, running view at path /Views/Home/Index.cshtml. 
2020-12-29 14:40:35.9577|1|DEBUG|Microsoft.AspNetCore.Mvc.Razor.RazorViewEngine|View lookup cache miss for view '_Layout' in controller 'Home'. 
2020-12-29 14:40:36.0454|1|DEBUG|Microsoft.AspNetCore.Mvc.Razor.Internal.RazorViewCompiler|Compilation of the generated code for the Razor file at 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\Views\Shared\_Layout.cshtml' started. 
2020-12-29 14:40:36.2080|2|DEBUG|Microsoft.AspNetCore.Mvc.Razor.Internal.RazorViewCompiler|Compilation of the generated code for the Razor file at 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\Views\Shared\_Layout.cshtml' completed in 161.8031ms. 
2020-12-29 14:40:36.2209|2|DEBUG|Microsoft.AspNetCore.Mvc.Razor.TagHelpers.HeadTagHelper|Tag helper component 'Microsoft.AspNetCore.ApplicationInsights.HostingStartup.JavaScriptSnippetTagHelperComponent' initialized. 
2020-12-29 14:40:36.2209|3|DEBUG|Microsoft.AspNetCore.Mvc.Razor.TagHelpers.HeadTagHelper|Tag helper component 'Microsoft.AspNetCore.ApplicationInsights.HostingStartup.JavaScriptSnippetTagHelperComponent' processed. 
2020-12-29 14:40:36.2367|2|DEBUG|Microsoft.AspNetCore.Mvc.Razor.TagHelpers.BodyTagHelper|Tag helper component 'Microsoft.AspNetCore.ApplicationInsights.HostingStartup.JavaScriptSnippetTagHelperComponent' initialized. 
2020-12-29 14:40:36.2367|3|DEBUG|Microsoft.AspNetCore.Mvc.Razor.TagHelpers.BodyTagHelper|Tag helper component 'Microsoft.AspNetCore.ApplicationInsights.HostingStartup.JavaScriptSnippetTagHelperComponent' processed. 
2020-12-29 14:40:36.2942|2|INFO|Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker|Executed action ASP.NET_Core_2___VS2017.Controllers.HomeController.Index (ASP.NET Core 2 - VS2017) in 4181.1451ms 
2020-12-29 14:40:36.3036|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK64" completed keep alive response. 
2020-12-29 14:40:36.3273|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 4515.4954ms 200 text/html; charset=utf-8 
2020-12-29 14:40:36.3273|1|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK67" started. 
2020-12-29 14:40:36.3273|1|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK66" started. 
2020-12-29 14:40:36.3386|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/lib/bootstrap/dist/css/bootstrap.css   
2020-12-29 14:40:36.3386|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/css/site.css   
2020-12-29 14:40:36.3610|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/css/site.css'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\css\site.css' 
2020-12-29 14:40:36.3610|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/lib/bootstrap/dist/css/bootstrap.css'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\lib\bootstrap\dist\css\bootstrap.css' 
2020-12-29 14:40:36.4312|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK66" completed keep alive response. 
2020-12-29 14:40:36.4312|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 90.8043ms 200 text/css 
2020-12-29 14:40:36.4312|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK67" completed keep alive response. 
2020-12-29 14:40:36.4312|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 98.4683ms 200 text/css 
2020-12-29 14:40:36.4710|1|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK68" started. 
2020-12-29 14:40:36.4710|1|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK69" started. 
2020-12-29 14:40:36.4819|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/lib/jquery/dist/jquery.js   
2020-12-29 14:40:36.4819|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/lib/bootstrap/dist/js/bootstrap.js   
2020-12-29 14:40:36.4819|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/images/banner2.svg   
2020-12-29 14:40:36.4819|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/js/site.js?v=ji3-IxbEzYWjzzLCGkF1KDjrT2jLbbrSYXw-AhMPNIA   
2020-12-29 14:40:36.4819|1|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK6A" started. 
2020-12-29 14:40:36.4819|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/js/site.js'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\js\site.js' 
2020-12-29 14:40:36.4819|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/lib/jquery/dist/jquery.js'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\lib\jquery\dist\jquery.js' 
2020-12-29 14:40:36.4819|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/images/banner2.svg'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\images\banner2.svg' 
2020-12-29 14:40:36.4933|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK67" completed keep alive response. 
2020-12-29 14:40:36.4819|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/lib/bootstrap/dist/js/bootstrap.js'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\lib\bootstrap\dist\js\bootstrap.js' 
2020-12-29 14:40:36.4933|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 20.2541ms 200 application/javascript 
2020-12-29 14:40:36.5143|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK66" completed keep alive response. 
2020-12-29 14:40:36.5143|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 32.361ms 200 image/svg+xml 
2020-12-29 14:40:36.5143|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/lib/bootstrap/dist/fonts/glyphicons-halflings-regular.woff2   
2020-12-29 14:40:36.5401|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/lib/bootstrap/dist/fonts/glyphicons-halflings-regular.woff2'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\lib\bootstrap\dist\fonts\glyphicons-halflings-regular.woff2' 
2020-12-29 14:40:36.5401|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/images/banner1.svg   
2020-12-29 14:40:36.5401|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/images/banner1.svg'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\images\banner1.svg' 
2020-12-29 14:40:36.5539|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK67" completed keep alive response. 
2020-12-29 14:40:36.5539|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 39.9074ms 200 font/woff2 
2020-12-29 14:40:36.5745|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/images/banner3.svg   
2020-12-29 14:40:36.5745|1|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request starting HTTP/1.1 GET http://localhost:56152/images/banner4.svg   
2020-12-29 14:40:36.5951|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK68" completed keep alive response. 
2020-12-29 14:40:36.6015|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 119.5389ms 200 application/javascript 
2020-12-29 14:40:36.6015|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/images/banner4.svg'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\images\banner4.svg' 
2020-12-29 14:40:36.5745|2|INFO|Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware|Sending file. Request path: '/images/banner3.svg'. Physical path: 'X:\nlog\NLog.Web\examples\ASP.NET Core 2\Visual Studio 2017\ASP.NET Core 2 - VS2017\wwwroot\images\banner3.svg' 
2020-12-29 14:40:36.6946|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK64" completed keep alive response. 
2020-12-29 14:40:36.6703|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK66" completed keep alive response. 
2020-12-29 14:40:36.6946|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 119.7561ms 200 image/svg+xml 
2020-12-29 14:40:36.6015|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK69" completed keep alive response. 
2020-12-29 14:40:36.7137|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 170.2078ms 200 image/svg+xml 
2020-12-29 14:40:36.7137|9|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK6A" completed keep alive response. 
2020-12-29 14:40:36.7560|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 181.4017ms 200 image/svg+xml 
2020-12-29 14:40:36.6946|2|INFO|Microsoft.AspNetCore.Hosting.Internal.WebHost|Request finished in 216.2838ms 200 application/javascript 
2020-12-29 14:42:21.6657|6|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK68" received FIN. 
2020-12-29 14:42:21.6657|6|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK67" received FIN. 
2020-12-29 14:42:21.6657|10|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK67" disconnecting. 
2020-12-29 14:42:21.6657|6|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK69" received FIN. 
2020-12-29 14:42:21.6657|10|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK69" disconnecting. 
2020-12-29 14:42:21.6657|7|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK67" sending FIN. 
2020-12-29 14:42:21.6657|10|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK68" disconnecting. 
2020-12-29 14:42:21.6657|2|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK69" stopped. 
2020-12-29 14:42:21.6800|6|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK66" received FIN. 
2020-12-29 14:42:21.6800|10|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK66" disconnecting. 
2020-12-29 14:42:21.6657|7|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK69" sending FIN. 
2020-12-29 14:42:21.6657|2|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK67" stopped. 
2020-12-29 14:42:21.6800|6|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK6A" received FIN. 
2020-12-29 14:42:21.6800|10|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK6A" disconnecting. 
2020-12-29 14:42:21.6800|6|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK64" received FIN. 
2020-12-29 14:42:21.6800|10|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK64" disconnecting. 
2020-12-29 14:42:21.6800|7|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK68" sending FIN. 
2020-12-29 14:42:21.6800|2|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK68" stopped. 
2020-12-29 14:42:21.6800|7|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK66" sending FIN. 
2020-12-29 14:42:21.6800|2|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK66" stopped. 
2020-12-29 14:42:21.6943|2|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK6A" stopped. 
2020-12-29 14:42:21.6943|7|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK6A" sending FIN. 
2020-12-29 14:42:21.6943|7|DEBUG|Microsoft.AspNetCore.Server.Kestrel.Transport.Libuv|Connection id "0HL8G4U42CK64" sending FIN. 
2020-12-29 14:42:21.6943|2|DEBUG|Microsoft.AspNetCore.Server.Kestrel|Connection id "0HL8G4U42CK64" stopped. 

The file format can be defined using the layout attribute in a target element.

NLog is easy to use and configure with an ASP.NET Core application. In this post, I have explained the NLog file-logging configuration for ASP.NET Core 2.

How to Connect Samples of Telerik Reporting to MySQL

Some of our customers don’t use Microsoft SQL Server as their database. What is not well known is that Telerik Reporting supports MySQL database out of the box.

Below, I detail the steps required to design reports against a MySQL database using the Telerik Report Designer.

  • First, install MySQL Connector/NET from the link below.

https://dev.mysql.com/downloads/connector/net/

  • Please download the AdventureWorks database for MySQL from the link below and import it into your MySQL server.

https://sourceforge.net/projects/awmysql/

  • After installing the MySQL Connector, you will see "MySQL Data Provider" in the data provider drop-down list; select it.
  • Enter the connection string for the AdventureWorks database (a typical format is shown after this list) and click the Next button.
  • Change the alias if you want and click the Next button.
  • Adjust the query for your database and click the Next button.
  • On the screen that follows, click the Execute Query button.
  • If data appears after clicking Execute Query, the data source is configured; click the Finish button. If not, review the query and try again.
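As an illustration (the values are placeholders, not taken from the original screenshots), a MySQL ADO.NET connection string typically has the following shape:

Server=<server name>;Database=<db name>;Uid=<username>;Pwd=<password>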

Above, we have seen how to enable Telerik Reporting Designer to fetch data from MySQL using the MySQL ADO.NET connector. There is one additional step that you need to do to render the report.

This step is to add the MySQL NuGet package to the host application. The package to add is MySql.Data, which can be installed as shown below. This adds the capability to connect to MySQL from the Telerik Reporting host application.
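For example, from the NuGet Package Manager Console in the host project (package name as published on nuget.org):

PM> Install-Package MySql.Data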

Now your report is able to render data from the MySQL database.

Connect Telerik Reporting with PostgreSQL

One of our customers required their Telerik Reporting application to connect to their PostgreSQL database (for effect: no, not Microsoft SQL Server, but the open-source PostgreSQL).

While at the outset it may appear almost impossible, this task is very easy to accomplish with Telerik Reporting.

We accomplish this using the ADO.NET driver for PostgreSQL. The same approach can be used for other databases like MySQL.

In this post, we will discuss how to get Telerik Reporting working on .NET Core 3.1.

The first thing is to understand that Telerik Reporting is a framework that has three distinct and independent pieces:

  1. Telerik Report Definition
  2. Telerik Reporting Host Application
  3. Telerik Report Viewer

First you would want to create the Telerik report definition (a .trdp file). To create it, we use the Telerik Report Designer. By default, there is nothing in it that supports PostgreSQL. So here is the first step: download the Npgsql driver (the MSI installer).

Once done, this adds a new data provider to the SQL Data Source of the designer:

Click on the SQL Data Source and add a new data connection. In the dropdown, select the newly available provider "Npgsql Data Provider":

Following this you will need to provide the connection string. The Postgres connection string is of the following format:

Host=<server name>;Database=<db name>;Username=<username>;Password=<password>

Provide the relevant SQL statement and complete creating the data connection. The remaining steps are the same as for creating a regular Telerik report definition.

Preview the report to make sure the data appears as expected.

The second step is to configure the hosting application. Since we are now working with .NET Core, you can start with the .NET Core Web API application template and add the relevant NuGet packages to it. Detailed instructions are available here: https://docs.telerik.com/reporting/telerik-reporting-rest-service-aspnetcore-mvc-core3

This will make the host application provide the Telerik Reporting service. You can check whether reporting has been set up correctly by browsing to the URL: http://localhost/api/reports/formats

The extra step is to include an additional NuGet package in the host application: Npgsql.
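For example, from the command line in the host project folder:

dotnet add package Npgsql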

The next change in the host application is the connection string and its provider. Specify your connection string in appsettings.json as follows:
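As a sketch (not from the original post; the exact shape can vary between Telerik Reporting versions, so check the KB article in the references below), the entry could look like this, assuming a connection named ReportsConnection that matches the one used in the report definition:

"ConnectionStrings": [
  {
    "name": "ReportsConnection",
    "connectionString": "Host=<server name>;Database=<db name>;Username=<username>;Password=<password>",
    "providerName": "Npgsql"
  }
]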

Pay special attention to the providerName above.

Congratulations you are done!!

The third and final piece, the Telerik Report Viewer, doesn't require any changes.

Your client application (in my case a simple HTML5 application) can now simply render the report from the host application.

References:

https://docs.telerik.com/reporting/knowledge-base/configuring-postgres-with-npgsql

https://www.telerik.com/forums/configure-standalone-report-designer-for-postgresql-data-source

https://docs.microsoft.com/en-us/aspnet/core/security/cors?view=aspnetcore-5.0#enable-cors

How to Download Telerik Software

Log in to your account at Telerik.com.

There are two ways to download Telerik products, doing so online as laid out in the instructions below or using the Control Panel, which is available for download from your account home page.

Once logged into your account, click the “Products & Subscriptions” button (Please note, the account in the screenshot is a test account, you may have different products listed other than DevCraft Complete)

Next, click on the appropriate product

On the next screen, choose the blue “Download Installer and other resources” button

Click on the product that you wish to install

On the next page, choose to either download the product or the latest internal build of the product. Please ensure that the license type is "Purchase" and not "Trial".

Webinar: Starting out with aPaaS

Telerik has been at the forefront of providing tools for improving developer experiences. Now, a new kind of technology is on the horizon called aPaaS – Application Platform as a Service.

This technology enables enterprises to create better, cheaper and faster applications with very little code. This is going to be a big boost for developers, who are constantly required to deliver top-quality code in minimal time. This single technology enables building visually immersive experiences across web, iOS and Android, as well as engaging chatbots. Not just that, it gives developers full control over the application code and the development experience.

This platform is called Progress Kinvey Studio. In this webinar, we will introduce aPaaS and Kinvey.

The webinar details are as follows:

When: Thursday, Sept 19 2019, 15:00 – 16:00 hrs (IST)

Register here: https://www.techgig.com/webinar/Beginning-aPaaS-Low-Code-Development-for-Web-Mobile-and-Chat-in-the-Real-World-1608

Presenter: Mr. Abhishek Kant, CEO, GTM Catalyst Pvt Ltd

Who should attend: Project Managers, Developers and CTOs

In the webinar Mr. Abhishek will go over:

*Understand aPaaS concepts
*Visual designer for building apps in a matter of minutes
*Write custom logic and UI to finetune the digital experience
*Round-trip code editing (legible code, and editing in your tool of choice)
*Code portability (no vendor lock-in)
*Enterprise data and authentication integration (use existing data sources)
*Access to cutting-edge features like Augmented Reality and Chatbots
*Simultaneous web, iOS, and Android development

In the Telerik tradition, we will be giving out 3 T-shirts to the most engaged participants at the webinar.

See you at the webinar!

Code-less way to authenticate to Azure Resource Manager API from Azure App Services

This is a guest post by Sujay Sarma:

Typical examples that show you how to connect from a web application to Azure Resource Manager API have you wading through configuring and meddling with OAuth and Owin, not to mention getting you confused between ADAL, MSAL and the different types of Active Directory tenants offered by Azure. We do not need ANY of that, especially if your web application is going to live on as an Azure App Service.

Teeth gnashing? Mouth Salivating?

The short answer is to use “Azure App Service Authentication”. And it is nothing new. It has been around since at least November 2014 (wow! a little over four years!). At least for me, though I had seen it plenty of times while configuring my App Services in Azure, I had scarcely looked at what it could do. Until now.

A project I was working on for a client required authenticating Azure subscribers to the portal. Initially, I went with the regular walkthroughs. I went into my Azure Active Directory blade and under App Registrations, created a new app, secrets and so on. But I faced a really strange issue: there were no issues for me (a developer never faces issues and has zero bugs on their dev box, yeah?), but my client contact could not log in. He was using a Hotmail login address. I had to add him as a Guest User in my Active Directory tenant! That was not going to be a viable action plan for any further step of the project.

The problem, I determined, was that folks on my tenant could log in, but not others. Strangely, another friend was able to login — his account was a custom domain hosted within another Azure Active Directory Tenant (he was using his Organization ID and apparently they were Azure subscribers as well).

So, I tried to use Azure B2C. This is a poorly documented system, where the current documentation and the portal’s user experience differ significantly. Not only that, there is a lot of confusing terminology used in the documents, and you have to register “apps” in at least three places, not to mention the Web App I was trying to configure! Short story: it was a mess!

I told everybody I was giving up on the issue. We would find some “manual” way to get people to authenticate. That was when an unrelated Google search threw up the page on App Service Authentication. I sent the URL to my mobile to read it during dinner and turned off my computer for the day. Even after I had read the article in question, I only thought of writing a small POC to see what it could do. The next morning, I sat down to set it up. And boy, oh boy! was I in for a pleasant surprise!

To save people the trouble of having to go through the same trial and error I did, here is a concise walkthrough of how to do it. I must thank Chris Gillum, whose 2016 blog post clued me into the right course correction to get everything working.

The Walkthrough

  1. Log in to the Azure Portal.
  2. If it is already on the menu on your left/right hand side, use that. Otherwise, click “All Services” and search there. Go into “Azure Active Directory”. If you’re having a hard time chasing it down, click here to go there directly (you may be prompted to log in).
  3. Now click on the “App registrations (Preview)” item. The official documentation follows the flow of going into the other “App registrations” — do not do that, that will end up giving you “OAuth v1.0” tokens. We need “OAuth v2.0” tokens. I found this out by trial and error. Click here to go to the right blade.
  4. Open a Notepad window.
  5. Along the top of the applications view, find the button that says “Endpoints” and click that. Towards the bottom of the pane full of URLs, find the one that says “WS-Federation sign-on endpoint” (third from the bottom at this time). Copy that FULL address and paste it into your Notepad. Now, in Notepad, carefully delete the “/wsfed” from the end of that address — be careful not to delete anything before the “/”. To be safe, you can hit CTRL+H, in Find, type “/wsfed”, leave Replace as blank and hit “Replace All”.
  6. Now hit “+ New Registration”. Enter any name. Be aware that what you enter here will be shown in big bold letters when Azure later asks the user trying to login for consent (the “…. is asking you for permission to access…” UI). Select the option “Accounts in any organizational directory”. Leave the “Redirect URI” blank for now, we will come back to it later. Click on Register.
  7. Once the Azure Portal tells you that the application was deployed successfully, find it again in the same “App Registrations (Preview)” screen and click on it to enter it.
  8. From the overview page, find the “Application (client) ID”. It will be a Guid. Hover on the value to make the “copy” icon appear. Click it to copy it.
  9. Switch to your Notepad window, type in “App ID”, hit ENTER and paste what you copied in step 8.
  10. OPTIONAL. Back on the Azure Portal, go into “Branding” and upload a logo. A picture of size 48×48 pixels works best. Anything else will cause the consent screen to appear in strange shapes and sizes. What you enter into the various URL fields there is not relevant; they will be used to show information links at the bottom of the consent screen. You may leave them blank or enter valid URLs into them; they need not even be on your website!
  11. IMPORTANT. Go into “Certificates & Secrets”. Under the “Client secrets” heading, click “+ New client secret”. Enter a name (does not matter, it is for your convenience), select an expiry value (“Never” is all you need). Click Add. When the new password is generated, it will be shown there. Again, hover on the value under the “VALUE” heading to make the little icon appear and copy it (if you don’t copy it fully, you will be in a world of pain).
  12. Switch to your Notepad window, type in “Secret”, hit ENTER and paste what you copied in step 11.
  13. IMPORTANT. Go into “API Permissions”. Click on “+ Add a permission”. Select “Microsoft Graph” (at the time of writing this, it is a large banner like button right on top of the list that appears). Select “Delegated Permissions”. Check ON: email, offline_access, openid and profile. Scroll to the bottom and find “User”, expand it and check ON “User.Read”. Click “Add permissions” at the bottom.
  14. Now click the “+ Add a permission” again. This time, select “Azure Service Management” (there are many similar looking “Azure” permissions on the list, select the right one). There is only one permission at this time, select it (or select “user_impersonation” if you find more permissions when you’re reading this!). Click “Add permissions” at the bottom.
  15. Right-click on the “App Services” menu item on the navigation (or find it under “All Services”) and select to open it in a new tab — you need to come back to the settings you were working with so far — we are not done there yet! Anyway, end up here.
  16. NOTE: If you already have an app service that you are configuring this for, you can use that. Otherwise, create a new app service. There is nothing special to be done there — and you don’t need to upload any code YET. Once you have selected the app service or created one, continue below.
  17. Select the App Service, open its Authentication/Authorization blade. Set the “App Service Authentication” option to “On”. Immediately a bunch of options will appear below it.
  18. We are only interested in the “Azure Active Directory” option in this walkthrough. So select that. A new blade will open.
  19. Select “Advanced”. A different set of options will appear under it.
  20. For “Client ID”, paste the value pasted under “App ID” from your Notepad window.
  21. Under “Issuer Url”, paste the URL you saved in your Notepad in Step 5 above (it will look like “https://login.microsoftonline.com/…”).
  22. Under “Client Secret”, paste in the value under “Secret” from your Notepad window.
  23. Now, this is very important. Under “Allowed Token Audiences”, first paste in “https://management.core.windows.net/”, then tab out. Another text box will appear. Now paste in “https://management.azure.com/” (ensure the trailing “/” is there in both). This tells the system to get you the Bearer Tokens that will work with the Azure REST API 🙂 This is the secret magic sauce to the whole thing!
  24. Click OK.
  25. Back on the “Authentication / Authorization” blade, select one of “Allow Anonymous requests (no action)” or “Login with Azure Active Directory”. If you plan to show a “Sign in” link on your website — that is, you want the user to see something before they need to login, then use the “Allow Anonymous requests (no action)” option. If like with the Azure Portal, you want them to be signed in from the get-go, use the “Login with Azure Active Directory” option.
  26. Ensure “Token Store” (bottom of the page, under “Advanced settings”) is “On”.
  27. Click “Save” on top of the page to save everything.
  28. IMPORTANT CAVEAT: If at any point of time, you make changes to this set up, you will need to Restart your App Service before it will use the new values.
  29. Go into the “Overview” blade of your App Service. Wait for all the properties on the top panel to load, and copy the full value of the “URL” (“https://xyz.azurewebsites.net”). Note how this is “https”?
  30. Now go back to the Azure Directory screen — if you left it open in another tab or window at the end of step 14, switch to that tab. Otherwise, navigate to it from the menu, or click here.
  31. Ensure you are within the Azure Directory Application (the one you configured from step 3 to 14).
  32. Click into the “Authentication” tab.
  33. Under “Redirect URIs”, ensure “Web” is selected for “TYPE”, under “REDIRECT URI”, paste in the URL to the App Service (Step 29). At the end of this URL, paste in “/.auth/login/aad/callback” [be careful to paste in everything between the quotes]. Note that there is a dot (“.”) in front of “auth”. Your final URL should look like: “https://contoso.azurewebsites.net/.auth/login/aad/callback“.
  34. Scrolling down, under Advanced settings, paste/enter the Logout URL to make it thus: “https://contoso.azurewebsites.net/.auth/logout“. Again, note the “.” in front of “auth”.
  35. Scrolling down, under “Implicit grant”, check ON both “Access tokens” and “ID tokens”.
  36. Click Save above.

Your Azure configuration is DONE.

From your App Service Code

Fire up Visual Studio, create a new web application. I am using a regular Web Forms application. You are free to do this in MVC or .NET Core or whatever. I am using Visual Studio 2019, and selected the “ASP.NET Web Application (.NET Framework)” option. If you are prompted to select the type of authentication — leave it as “No authentication”.

You do not need to install new NuGet packages! Azure App Service automatically fetches the right Bearer token for you (without any plumbing!). This is available to you in the Request Header “X-MS-TOKEN-AAD-ACCESS-TOKEN”. Fetch it from:

string token = Request.Headers["X-MS-TOKEN-AAD-ACCESS-TOKEN"];

You can now pass this token to your AzureRM REST API calls. I do not use any of the Azure SDKs to talk to AzureRM, and write System.Net.Http.HttpClient-based GET/PUT/etc. calls. My code to pull all the subscriptions for a logged-in user now looks like this:

HttpClient client = new HttpClient(); // from System.Net.Http

HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, "https://management.azure.com/subscriptions?api-version=2019-03-01");

// AuthenticationHeaderValue comes from System.Net.Http.Headers
request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", Request.Headers["X-MS-TOKEN-AAD-ACCESS-TOKEN"]);

HttpResponseMessage response = await client.SendAsync(request);

Simple, huh?

The “Post Digital” Enterprise

Digital Transformation has graduated from being a differentiating advantage to being the price of admission. Enterprises must now take the data and experiences collected in the earlier phase and use powerful new technologies to innovate in their business models and personalize experiences for their customers.

According to a recently released Accenture Technology Vision 2019 report, nearly four in five (79 percent) of more than 6,600 business and IT executives worldwide  surveyed believe that digital technologies ― specifically social, mobile, analytics and cloud ― have moved beyond adoption silos to become part of the core technology foundation for their organization. Respondents were C-level executives and directors at companies across 27 countries and 20 industries, with the majority having annual revenues greater than US$6 billion.

Five technology trends identified in the report that can provide the elusive competitive edge to the willing enterprise are as follows:

  1. DARQ Power: This newly coined phrase stands for distributed ledgers, artificial intelligence, extended reality and quantum computing (DARQ). Amongst these, 41 percent of executives ranked AI as number one in terms of impact.
  2. Unlock new opportunities: Leverage data captured from interactions to deliver rich, individualized, experience-based relationships. More than four in five executives (83 percent) said that digital demographics give their organizations a new way to identify market opportunities for unmet customer needs.
  3. Human+ Worker: The typical employee is digitally savvy, with 71 percent of executives believing that their employees are more digitally mature than their organization, resulting in a workforce “waiting” for the organization to catch up.
  4. Security Flow: Security is no longer limited to enterprise boundaries. Growing interconnectedness with ecosystem partners increases companies’ exposure to risk. Only 29 percent of executives said they know their ecosystem partners are working diligently to be compliant and resilient with regard to security.
  5. Meet consumers at the speed of now: Direct digital access to customers and powerful analytics capabilities enable innovative personalization strategies. Six in seven executives (85 percent) said that the integration of customization and real-time delivery is the next big wave of competitive advantage.

The focus on personalisation as a differentiator is foremost in the report. I had highlighted a similar trend in an interview with Marketing with Maveriks podcast in Jan 2019.

From my universe, Sitefinity is a wonderful digital marketing tool that provides for static and dynamic personalisation across channels.

Have thoughts around where we are headed? Leave a comment below.

Awarded 10 Most Promising Machine Learning Solution Providers

We are pleased to share that GTM Catalyst has been named one of the “10 Most Promising Machine Learning Solution Providers” in India by CIOReview.

Here is a brief introduction of the award by CIOReview Magazine:

CIOReview India presents a list of “10 Most Promising Machine Learning Solution Providers”. Being closely scrutinized by a distinct panel of judges including CEOs, CIOs, CXO, analysts and CIOReview editorial board, we believe these solution vendors can bridge the gap between businesses and solution providers that are transforming business processes through their significant offerings.

A few extracts from the feature:

The top use case for AI has been customer facing technologies like conversational chatbots. Acknowledging these needs, Gurgaon headquartered GTM Catalyst provides customer facing AI solutions, notably the conversational chatbots under the KatalystAI platform, that have been utilized by industries such as medical institutions, consumer electronics and the automobile sector.

KatalystAI platform is a System of Intelligence for enterprises.

The KatalystAI platform of GTM Catalyst serves to augment the robust sales forecasting processes that exist within an organization by employing the latest Machine Learning algorithms like boosted decision trees and deep learning algorithms.

The entire feature can be downloaded from here.

How-To: Create Charts with Kendo UI with Remote Data

If you know jQuery and want to include data visualization elements in your web page without all the hassle, you are in the right place. In this post, we are going to give you a quick view of how Kendo works with jQuery to create a pie chart.

We will build a ratings pie chart, step by step. Final product is shown below.

1. API

We need access to an API which we can call to get our remote data in the form of JSON. An API like this:

https://<some-url>/totalratings

which returns JSON data like this:

[
  {
    "category": "Asia",
    "value": 53.8,
    "color": "#9de219"
  },
  {
    "category": "Europe",
    "value": 16.1,
    "color": "#90cc38"
  },
  {
    "category": "Latin America",
    "value": 11.3,
    "color": "#068c35"
  },
  {
    "category": "Africa",
    "value": 9.6,
    "color": "#006634"
  },
  {
    "category": "Middle East",
    "value": 5.2,
    "color": "#004d38"
  },
  {
    "category": "North America",
    "value": 3.6,
    "color": "#033939"
  }
]

should do the job. The data must be a JSON object or an array of JSON objects.

Note: If you’re the developer of the API, make sure the JSON is in a Kendo-compatible shape before sending it as the response. Check out the Kendo demos for more info.

2. Download

Now you need to download Kendo UI. There are several paid versions, and a free (trial) version. Trial is more than enough for trying it out.

Download Kendo UI for a trial period from here. You will have to sign up to download it.

3. Transport

Extract the downloaded ZIP archive to an easily accessible location. We are going to need its js and css folders.

4. Kickstart

Kickstart the project by creating a new folder, say kendo-pie. Copy the downloaded js and css folders in kendo-pie.

Now, create a new HTML file in kendo-pie, say index.html. This is our main webpage. The pie chart will reside here.

5. The HTML

Open index.html with your favourite text editor. Add some starter code.

Give it a title, say Overall Ratings. Link all the necessary js and css files inside the head.

Time to populate the body. Create a wrapper (div) with id overall. The actual chart element and its script will reside in this wrapper. Create the chart div inside the wrapper, with id chart. Give it some style with a style attribute.

The above goes inside the body, and the whole thing up to this point looks something like this:
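Here is a minimal sketch of the page at this point, assuming the js and css folder layout from step 4 and the standard Kendo bundle file names (jquery.min.js, kendo.all.min.js, kendo.common.min.css, kendo.default.min.css):

<!DOCTYPE html>
<html>
<head>
  <title>Overall Ratings</title>
  <!-- Kendo styles copied into the css folder in step 4 -->
  <link rel="stylesheet" href="css/kendo.common.min.css" />
  <link rel="stylesheet" href="css/kendo.default.min.css" />
  <!-- jQuery first, then the Kendo script from the js folder -->
  <script src="js/jquery.min.js"></script>
  <script src="js/kendo.all.min.js"></script>
</head>
<body>
  <div id="overall">
    <div id="chart" style="width: 600px; height: 400px;"></div>
    <!-- the chart script from the next steps goes here -->
  </div>
</body>
</html>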

6. The jQuery

Create a script element inside the wrapper, and add some starter jQuery code.

Inside the document-ready function, select the chart element with jQuery’s id selector, and apply the kendoChart() method.

7. The Kendo

kendoChart() takes a configuration object as an argument. This configuration object describes the chart and includes the data (local or remote) to be represented.

Let’s construct the configuration object (a sketch of the finished call follows the list below):

  1. Add title property.
  2. Add dataSource property: read and dataType.
  3. Add seriesDefaults property.
  4. Add series property: field and categoryField.
  5. Add tooltip property.
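Here is a minimal sketch of what the finished call inside the document-ready function could look like; the URL is the placeholder endpoint from step 1, and the title, tooltip template and the optional colorField are illustrative choices:

$(document).ready(function () {
  $("#chart").kendoChart({
    title: { text: "Overall Ratings" },
    dataSource: {
      transport: {
        read: {
          url: "https://<some-url>/totalratings", // the API from step 1
          dataType: "json"
        }
      }
    },
    seriesDefaults: { type: "pie" },
    series: [{
      field: "value",            // numeric value of each slice
      categoryField: "category", // label of each slice
      colorField: "color"        // optional: per-slice colour from the data
    }],
    tooltip: {
      visible: true,
      template: "#= category #: #= value #%"
    }
  });
});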

The kendoChart() method is ready. So is the script. The coding part is complete.

These were the basic steps to create a pie chart using jQuery and Kendo, mostly Kendo. Now, open index.html in a browser, and you should see output like the one below.

I hope the above steps were helpful in giving you a basic idea about Kendo UI. It’s up to you now to tweak the chart however you want, or create a new element altogether.

Documentation on the customization options is available here, and demos are here.

Authored by: Abhay Kumar.

How-To: Create Beautiful Charts with Kendo UI with Local Data

If you know jQuery and want to include data-viz elements in your web page without all the hassle, you are in the right place. I am going to give you a gist of how Kendo works with jQuery to create robust data-viz elements.

We will build a ratings pie chart, step by step. Final product is shown below.

1. Download

First things first. You need to download Kendo UI. There are several paid versions, and a free (trial) version. Trial is more than enough for trying it out.

Download Kendo UI for a trial period from here. You will have to sign up to download it.

2. Transport

Extract the downloaded ZIP archive to an easily accessible location. We are going to need its js and css folders.

3. Kickstart

Kickstart the project by creating a new folder, say kendo-pie. Copy the downloaded js and css folders in kendo-pie.

Now, create a new HTML file in kendo-pie, say index.html. This is our main webpage. The pie chart will reside here.

4. The HTML

Open index.html with your favourite text editor. Add some starter code.

Give it a title, say Overall Ratings. Link all the necessary js and css files inside the head.

Time to populate the body. Create a wrapper (div) with id overall. The actual chart element and its script will reside in this wrapper. Create the chart div inside the wrapper, with id chart. Give it some style with a style attribute.

The above goes inside the body; the resulting page structure is the same as the sketch shown in the remote-data walkthrough above.

5. The jQuery

Create a script element inside the wrapper, and add some starter jQuery code.

Inside the document-ready function, select the chart element with jQuery’s id selector, and apply the kendoChart() method.

6. The Kendo

kendoChart() takes a configuration object as an argument. This configuration object describes the chart and includes the data (local or remote) to be represented.

Let’s construct the configuration object (a sketch of the finished call follows the list below):

  1. Add title property.
  2. Add legend property.
  3. Add some defaults.
  4. Add series properties: type of chart and local data.
  5. Add tooltip property.
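Here is a minimal sketch of the script inside the wrapper; the data values reuse the regional ratings from the remote-data walkthrough above, and the legend position, label format and tooltip template are illustrative choices:

$(document).ready(function () {
  $("#chart").kendoChart({
    title: { text: "Overall Ratings" },
    legend: { position: "bottom" },
    seriesDefaults: {
      labels: { visible: true, format: "{0}%" }
    },
    series: [{
      type: "pie",
      data: [ // local data, defined right inside the configuration object
        { category: "Asia", value: 53.8, color: "#9de219" },
        { category: "Europe", value: 16.1, color: "#90cc38" },
        { category: "Latin America", value: 11.3, color: "#068c35" },
        { category: "Africa", value: 9.6, color: "#006634" },
        { category: "Middle East", value: 5.2, color: "#004d38" },
        { category: "North America", value: 3.6, color: "#033939" }
      ]
    }],
    tooltip: {
      visible: true,
      template: "#= category #: #= value #%"
    }
  });
});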

The kendoChart() method is ready. So is the script. The coding part is complete; the wrapper should look like the sketch above.

These were the basic steps to create a pie chart using jQuery and Kendo, mostly Kendo. Now, open index.html in a browser, and you should see output like the one below.

I hope the above steps were helpful in giving you a basic idea about Kendo UI. It’s up to you now to tweak the chart however you want, or create a new element altogether. There are loads of options available. Docs are available here, and demos here.

Note: This post is authored by Mr. Abhay Kumar, interning with GTM Catalyst (distributor of Telerik controls in India).

How-To: Connect Your Node.js App with SQL Server

Node.js is an exciting technology that has been widely adopted. For those starting out, one of the key requirements is the ability to connect Node.js with an enterprise RDBMS such as MS SQL Server.

In this post, we will guide you through the process of connecting your Node.js app with SQL Server successfully, and hopefully, without any errors, doubts or confusion.

Let’s get started!

1. Download

Before getting started on the mission, we need a couple of things:

  1. SQL Server 2017 Express Edition* from here, and
  2. SQL Server Management Studio (SSMS) 17.8* from here.

I am assuming you have Node.js installed on your PC.

*Version number might differ.

2. Install

Installation is easy. Double-click the SQL Server installer downloaded earlier, named something like SQLServer2017-SSEI-Expr*.

*Again, version number might differ.

Click Basic, then click Accept, and finally click Install.

After successful installation, you are greeted with a final screen containing information like Instance Name, SQL Administrators, Features Installed, Version, and also locations of various things including helpful Resources.

A row of four buttons is present at the bottom, containing: Connect Now, Customize, Install SSMS, and Close.

Close is pretty obvious, and we don’t need to touch Customize.

a. Connect Now

An instance of SQL Server starts running in the background automatically after successful installation (until you stop it manually).

The Connect Now button is a way to connect to that instance without any login. You can execute T-SQL statements right in the terminal.

Press the button, a new SQLCMD terminal window will open up. Terminal is all yours. T-SQL away!

b. Install SSMS

The Install SSMS button will take you to the same download page mentioned in Download above.

If you didn’t download SSMS earlier, now is the time. And then, just install it. Simple install, no worries.

3. Configure

OK! It’s time for some configurations:

  1. Enable TCP/IP to allow remote connections, and
  2. Enable default login or create a new one.

The default login in SQL Server is sa, which stands for System Administrator (aka sysadmin). It is disabled by default (I don’t know why). You need to enable it, or create a new sysadmin login for yourself.

1. Enable TCP/IP to Allow Remote Connections

Search in Start Menu for SQL Server Configuration Manager. Open it.

You can see that the SQL Server (SQLEXPRESS) service is running, and its Start Mode is Automatic, like I said earlier.

In the left pane, you are currently in the SQL Server Services section. Expand SQL Server Network Configuration, and click on Protocols for SQLEXPRESS. You can see TCP/IP is disabled by default. Right-click and Enable it.

Now, we need to set the default TCP port, which for SQL Server is 1433. Double-click on TCP/IP. Click on the IP Addresses tab. Scroll down to the bottom to reach the IPAll section. Clear the TCP Dynamic Ports field and leave it empty. Set TCP Port to 1433. Click OK.

Restart the SQL Server (SQLEXPRESS) service, and you are done with the first configuration. On to the next one!

2. Enable Default Login or Create a New One

Search in Start Menu for SQL Server Management Studio. Open it.

You are greeted with a dialog box to connect to the server. You have to connect via Windows Authentication because you don’t have a sysadmin login right now to connect via SQL Server Authentication. Exactly the point of this configuration. Click Connect.

On the left, there is an Object Explorer pane. Here you can manage your server: creating and deleting logins, creating and deleting databases, and loads of other admin things, so to speak.

Let’s enable the sa login. Expand Security. Expand Logins. You can see a little red cross on sa’s icon. This shows that the login is disabled.

Double-click sa. In the left pane, click Status. Select Enabled under Login in Settings. Click General in the left pane, change the password, and click OK. Bam! You have a sysadmin login now.

You can try re-connecting to the server with this newly enabled login, or one you create yourself (see below). Click File > Disconnect Object Explorer to disconnect. Click File > Connect Object Explorer…, this time selecting SQL Server Authentication in the Authentication drop-down menu. Enter sa as Login, and the password you chose earlier as Password.

If you want to create a new login:

  1. Connect to server, if not already.
  2. Expand Security in the left pane.
  3. Right-click Logins.
  4. Select New Login…
  5. Enter Login name.
  6. Select SQL Server authentication.
  7. Enter and re-enter Password.
  8. Click Server Roles in the left pane.
  9. Select sysadmin.
  10. Click OK.

You have successfully configured your SQL Server.

Errors

Nobody wants errors. But sometimes, they are inevitable. You may encounter one of the two errors when you are trying to connect your Node.js app with SQL Server:

  1. ESOCKET: TCP/IP is disabled. Perform the first configuration to get rid of this error.
  2. ELOGIN: Unable to login. Perform the second configuration to get rid of this error.

4. Connect

Let’s create the simplest Node.js app, and connect it to SQL Server.

Create a new folder, say node-sql. Execute npm init in this folder to create package.json.

We need a Node.js driver for SQL Server. tedious is one such driver. Execute:

npm install tedious --save

Create a new index.js file (which will be the main entry point for our app) in node-sql. Open index.js with your favourite text editor.

‘Require’ the required modules in the app.

const Connection = require('tedious').Connection;
const Request = require('tedious').Request;

Create a configuration object (config) to be used while connecting to the database.

const config = {
  userName: 'sa', // update
  password: 'your_password', // update
  server: 'localhost',
  options: {
      database: 'SampleDB' // update
  }
}

Use your preferred userName, password and database. Create a new Connection object with the config object created earlier.

const connection = new Connection(config);

Try to connect to the database with the newly created connection object.

connection.on('connect', function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log('Connected');
  }
});

Your simplest Node.js app looks like this:

const Connection = require('tedious').Connection;
const Request = require('tedious').Request;

const config = {
  userName: 'sa', // update
  password: 'your_password', // update
  server: 'localhost',
  options: {
      database: 'SampleDB' // update
  }
}
const connection = new Connection(config);

connection.on('connect', function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log('Connected');
  }
});

Execute:

node index.js

If you see this in console:

Connected

Congrats! You have successfully connected your Node.js app with SQL Server. If you are getting any errors, refer to the Errors section above.
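We imported the Request class above but haven’t used it yet. As a quick next step, here is a minimal sketch of running a query once the connection succeeds; the SELECT statement is purely illustrative:

connection.on('connect', function(err) {
  if (err) {
    console.log(err);
    return;
  }

  // Illustrative query; replace it with your own statement
  const request = new Request('SELECT @@VERSION AS SqlVersion;', function(err, rowCount) {
    if (err) {
      console.log(err);
    } else {
      console.log('Rows returned: ' + rowCount);
    }
  });

  // Log each column of each returned row
  request.on('row', function(columns) {
    columns.forEach(function(column) {
      console.log(column.metadata.colName + ': ' + column.value);
    });
  });

  connection.execSql(request);
});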

I hope this article was helpful in giving you a quick overview of connecting your node.js application with MS SQL Server.

Note: This post is authored by Mr. Abhay Kumar, interning with GTM Catalyst (distributor of Telerik controls in India).