Event: Microsoft Azure in Education and Research

On May 10, 2018, in Room 1.4 of the Escuela Superior de Ingeniería y Tecnología, we are lucky to have the Microsoft Spain Education team in Tenerife to present a day of sessions on Microsoft Azure in the fields of education and research.

The goal of this event is to show users, institutions and companies in the Canary Islands a variety of interesting projects and experiences related to cloud computing on the Microsoft Azure platform. From the Universidad de La Laguna, through the Grupo Taro and in collaboration with different companies, a line of work has been started to build expertise in ICT technologies, as well as to develop joint projects with companies and institutions that make it possible to be competitive from the Canary Islands.

To attend the event you will need to register through this link:
Register for the event here

At Intelequia, together with Microsoft, we have prepared a series of sessions showing aspects of the Microsoft Azure platform aimed at getting the most out of it in education and research. Among the sessions, we are lucky to have Sebastián Hidalgo, researcher at the IAC, back again to tell us the details of one of the projects he has been working on, “The Secret Life of the Galaxies: Unveiling the true nature of their star formation”, which was presented last year in Madrid at the Global Azure Bootcamp, where more than 10,000 participants collaborated through a global compute farm.

Besides that session, the agenda is full of interesting talks on artificial intelligence and on what the applications of the future will look like; a future that is already here.

  • 9:30-9:45 Welcome | Óscar Sanz (Education Director) | Microsoft
  • 9:45-10:30 The secret life of galaxies: unveiling the nature of their star formation | Sebastián Hidalgo (Researcher) | Instituto de Astrofísica de Canarias (IAC)
  • 10:30-11:15 Building the apps of the future | David Rodriguez (CTO) | Intelequia
  • 11:15-11:30 Coffee break
  • 11:30-12:15 Containers, Kubernetes and the new cloud DevOps model | Diego Martinez (Solution Specialist) | Microsoft
  • 12:15-13:00 Azure Computer Vision and TensorFlow | Alexander González (Microsoft Student Partner) | UEM
  • 13:00-13:45 How I tried to get rich with the cloud and failed miserably | Alberto Marcos (Universities Lead) | Microsoft
  • 13:45-14:00 Closing | Óscar Sanz (Education Director) | Microsoft

See you on May 10!

DNN Azure AD Provider 3.0

Hi! After another round of work with React and the DNN Persona Bar, and with the special collaboration of Microsoft Azure MVP Cesar Abreu (@cesabreu), a new version of the DNN Azure AD Provider has been published.

Download DNN Azure AD Provider 3.0 from GitHub

This release includes several new features:

  • New Persona Bar integration: the Azure AD provider now has its own area in the Persona Bar, so you can easily set up the provider without digging into the authentication provider submenus. This area was also needed as a starting point for the features coming in the next release, such as role sync and claims/profile mapping;
  • Auto-redirect: if Azure AD is the only provider you use, you will probably want to go straight to the AD login page without a previous stop on the DNN login page. This option does exactly that. If, as an admin, you still need to log in with DNN credentials while this option is enabled, you can append “legacy=1” to the query string of your login page (e.g. /login?legacy=1) to log in as a regular DNN user. In any case, remember you can make an Azure AD user a super user of your site;
  • Setup simplification: previous releases required two applications to be set up in your Azure AD. This has been simplified, and only one App registration is needed. The other parameters have been simplified as well, so you only need to specify the Azure Tenant ID, the App ID and the secret. Check the setup instructions on GitHub;
  • Logout: when I initially created the provider, I followed the pattern of the other DNN authentication providers (Twitter, Google, etc.), where the logout process is not actually implemented: on logout the user is logged out from DNN, but the OAuth token is not expired on the OAuth provider. In this release the logout process is fully implemented, so both the DNN cookie and the OAuth token are correctly expired. Check the video below:

Apart from these new features, several bugs have been fixed.

The project is available on GitHub, as always.

Best regards and happy coding!

DNN Redis Caching Provider 3.0

Hi again! Following up on the DNN module updates, I have made some modifications to the DNN Redis Caching provider so it can now be configured through the DNN Persona Bar.

The summary of changes in this release:

  • Changed the minimum required DNN version to 9.0.1
  • Refreshed NuGet packages, including the latest version of StackExchange.Redis (1.2.6)
  • Added a configuration UI in the Persona Bar. The provider is no longer automatically enabled during installation; you need to use the new UI to enable it
  • The Redis client now automatically reconnects after a Redis connection failure (see the sketch below)
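
For reference, the reconnect behavior comes down to how the StackExchange.Redis client is configured. This is a minimal conceptual sketch, not the provider's actual code, and the connection string values are placeholders:

using System;
using StackExchange.Redis;

// abortConnect=false tells StackExchange.Redis to keep retrying in the background
// instead of failing hard when Redis is temporarily unreachable
var connection = ConnectionMultiplexer.Connect(
    "mycache.redis.cache.windows.net:6380,password=<key>,ssl=true,abortConnect=false");

// The multiplexer raises events when the connection drops and recovers
connection.ConnectionFailed += (s, e) => Console.WriteLine("Redis connection failed: " + e.FailureType);
connection.ConnectionRestored += (s, e) => Console.WriteLine("Redis connection restored");

IDatabase cache = connection.GetDatabase();
cache.StringSet("mykey", "myvalue");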


If you have any interesting ideas to add to the settings area, please let me know. I have thought about implementing a redis-cli command-line interface, but I believe it would be better to wait for DNN 9.2 and implement it as a “DNN Prompt” command (see http://www.dnnsoftware.com/community-blog/cid/155417/dnn-prompt-making-dnn-admins-power-admins-via-the-command-line)

Hope this helps.

Best regards and happy coding!

Building an Evoq Liquid Content chatbot with Azure Bot Service

A few months back we spent some days working on an internal hackathon at DNN Corp. called DNN Developer Days, which allowed us to explore the power of the Evoq Liquid Content APIs. The result was an awesome set of example projects giving a glimpse of what you can do when integrating the APIs to publish and reuse content through different channels, such as Amazon Echo, Azure Bot Service or a smart TV, including AI and machine learning capabilities such as automatically tagging your site images. You can explore all of them at the http://builtwithdnn.com website; I hope you find them interesting.

I personally worked on an Azure Bot Service chatbot that exposes the contents of a recipes website through different channels like Skype, Facebook Messenger, Telegram or Teams. You can quickly play with the example at http://builtwithdnn.com/recipesbot/


All the documentation and source code for each project is available in a GitHub repository at https://github.com/dnnsoftware/Dnn.Evoq.LiquidContent.Samples.Public/

Chatbot as a Channel: Integrating Liquid Content with Azure Bot Service

Integrating Artificial Intelligence (AI) into your site or application is easier than you think. Companies like Google, Microsoft and Amazon are making big bets on AI and machine learning, and all three provide freely available toolkits and services that developers can use to integrate AI into their applications.

The recipes bot integrates with messaging apps like Skype, Telegram and Facebook Messenger. As a Skype user, for example, you can provide the bot with a list of ingredients. The bot connects to Liquid Content to retrieve recipes that contain those ingredients, then replies to the Skype user.
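
To make that flow concrete, here is a rough sketch of what a single ingredient-lookup turn could look like with the Bot Builder SDK v3 for C#. This is illustrative only, not the project's actual code: the Liquid Content endpoint, query parameter and API key below are placeholder assumptions, so check the GitHub repository mentioned above for the real implementation.

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Connector;

[Serializable]
public class RecipesDialog : IDialog<object>
{
    public Task StartAsync(IDialogContext context)
    {
        // Wait for the first message from the user
        context.Wait(MessageReceivedAsync);
        return Task.CompletedTask;
    }

    private async Task MessageReceivedAsync(IDialogContext context, IAwaitable<IMessageActivity> result)
    {
        var activity = await result;
        var ingredients = activity.Text; // e.g. "chicken, rice, peppers"

        // Query Liquid Content for recipes containing those ingredients
        // (endpoint, parameter and auth header are hypothetical placeholders)
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Authorization", "Bearer <LIQUID-CONTENT-API-KEY>");
            var json = await client.GetStringAsync(
                "https://dnnapi.com/content/api/ContentItems?searchtext=" + Uri.EscapeDataString(ingredients));

            // The real bot parses the JSON and replies with a carousel of recipe cards;
            // here we just confirm that something came back
            await context.PostAsync($"Found a recipes payload of {json.Length} characters for: {ingredients}");
        }

        // Wait for the next message
        context.Wait(MessageReceivedAsync);
    }
}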

This is just the beginning. I plan to connect more third-party services to the recipe bot, such as LUIS, Cortana, Vision API and Apple Pay.

Bookmark this page and check back here to get updates. I’ll update the page each time there’s news to share. You can also follow me on Twitter @davidjrh and I will mention the new additions there.

Building the Recipes Bot

In this tutorial, I show you how I built the recipes bot. The bot uses Azure Bot Service, the intelligent, serverless bot service that scales on demand. With this service you can publish your bot through multiple channels without managing or patching any servers, and connect it to additional messaging apps without writing new code. The service is free; you only pay for the resources you consume.

Getting Started

Prerequisites

To follow the tutorial, you will need:

  • An active Azure subscription. You will need to log in as the owner of the subscription to host the bot service. You can sign up for a free subscription here.
  • An Evoq Content or Engage site with Evoq Liquid Content enabled

Also, to debug your C# bot locally using breakpoints in Visual Studio, you will need some additional tools.

For more information about how to set up the debugging environment, see Debug with the emulator and Debug an Azure Bot Service C# bot in the Bot Framework documentation.

Content Index

  1. Setting up the recipes
  2. Creating the basic bot
  3. Setting up continuous integration
  4. Debugging the bot on your local environment
  5. Customizing the basic bot
  6. Testing your recipes bot
  7. Adding a webchat to your site
  8. Known issues

Hope this helps.

Best regards and happy coding!

DNN Application Insights v3.0

Hi all. After a while working with React/Redux and the new DNN Persona Bar model introduced in DNN Platform 9.0, I have started updating all my modules to stop relying on the Host menu, which has disappeared from DNN.

The first module updated is DNN Application Insights, now available for download on GitHub. This release includes:

  • Package updates to support version 2.4.0 of Application Insights, including enhanced live metrics streaming (committed by Mitchel Sellers)
  • A new UI in the Persona Bar to set up the module settings
  • DNN Platform 9.0.1 or later is now required


If you have any interesting ideas to add to the settings area, please let me know. I have been reviewing all the settings that can be changed through the Application Insights configuration system, and I will start adding them in the next release.

Updates for the Redis Caching provider and the Azure AD provider are next.

Providing a GUID function in Azure Resource Manager templates with Azure Functions

Some time back, while preparing the Global Azure Bootcamp Science Lab, I faced the lack of certain functions when authoring Azure Resource Manager templates. When creating some types of resources, such as Batch jobs or RBAC-related resources, you need to pass a GUID (globally unique identifier), but there is no function to create one inside the template, so you have to pass them in as template parameters, which makes the result awful.

There is already a feedback item describing the issue (please vote at https://feedback.azure.com/forums/281804-azure-resource-manager/suggestions/13067952-provide-guid-function-in-azure-resource-manager-te), and there are similar gaps when you need more sophisticated functions, such as date-related values.

While looking for a workaround, I found the problem could be solved with nested templates, so I started by building a simple ARM GUID template that can be referenced from your main one. You can check that repo here: https://github.com/davidjrh/azurerm-newguid

But when testing the GAB lab with millions of GUIDs, I found that from time to time the template generated duplicates, so I finally ended up implementing the lab's ARM GUID templates with a Web API. You can see how the GUID template was used at https://github.com/intelequia/GAB2017ScienceLab/blob/master/lab/assets/GABClient.json#L100 (check lines 100 and 398), and the template API implementation at https://github.com/intelequia/GAB2017ScienceLab/blob/master/src/GABBatchServer/src/GAB.BatchServer.API/Controllers/TemplatesController.cs

Building a GUID function for ARM templates with Azure Functions

Revisiting this today, I thought it would be more cost effective to generate that template dynamically with an Azure Function. Here is a step-by-step guide so you can deploy your own and grow your ARM function arsenal with the same approach.

Remember that the basic idea is to reference this function in your deployment; it will generate one or more GUIDs for later use in your template.

  1. Create an Azure Function App through the Azure portal.
  2. Add a function triggered by an HTTP request in C#.
  3. Copy and paste the following code as the function body.

using System.Net;
using System.Net.Http.Headers;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // parse query parameter
    string numberOfGuids = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "numberOfGuids", true) == 0)
        .Value;
    if (string.IsNullOrEmpty(numberOfGuids)) {
        numberOfGuids = "1";
    }

    // validate the input
    int guids;
    if (!int.TryParse(numberOfGuids, out guids)) {
        return req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a valid number on the query parameter 'numberOfGuids'");
    }
    if (guids <= 0 || guids > 1000) {
        return req.CreateResponse(HttpStatusCode.BadRequest, $"Invalid number of guids {guids}. Must be a number between 1 and 1000");
    }

    // prepare the template
    var template = @"{
  ""$schema"": ""https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#"",
  ""contentVersion"": ""1.0.0.0"", ""parameters"": {}, ""variables"": {}, ""resources"": [],
  ""outputs"": {[OUTPUTS]}
}";
    var outputs = await Task.Run<List<string>>(() =>
    {
        var o = new List<string>();
        for (var i = 0; i < guids; i++)
        {
            o.Add(@"""guid" + i + @""": { ""type"": ""string"", ""value"": """ + Guid.NewGuid() + @""" }");
        }
        return o;
    });
    var result = template.Replace("[OUTPUTS]", string.Join(",", outputs.ToArray()));

    // return the response
    return new HttpResponseMessage() {
        Content = new System.Net.Http.StringContent(
            result,
            System.Text.Encoding.UTF8,
            "application/json"
        )
    };
}


Testing the function

Once you have saved the function, you can test it with a simple web request. In my case, calling the URL https://armtemplates.azurewebsites.net/api/NewGuid?numberOfGuids=3 returns this answer:
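
The GUID values below are illustrative (yours will differ), but the shape of the response follows from the function code above: a minimal deployment template that exposes each GUID as an output.

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {},
  "variables": {},
  "resources": [],
  "outputs": {
    "guid0": { "type": "string", "value": "3f0c2a7e-5b1d-4e8a-9c6f-0d2b7a4e1c59" },
    "guid1": { "type": "string", "value": "a9d4e1b2-7c3f-4a60-8e5d-1f2c3b4a5d6e" },
    "guid2": { "type": "string", "value": "5e6f7a8b-9c0d-4e1f-a2b3-c4d5e6f7a8b9" }
  }
}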

You can pass the number of GUIDs to generate as a query parameter.

Consuming the GUID template by using a nested template

As I documented in the initial GitHub repo, you can follow these examples to consume it in different ways:

Example 1. Getting a GUID and using it later

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {},
  "variables": {},
  "resources": [ 
    { 
        "apiVersion": "2015-01-01", 
        "name": "MyGuid", 
        "type": "Microsoft.Resources/deployments", 
        "properties": { 
          "mode": "incremental", 
          "templateLink": {
            "uri": "https://armtemplates.azurewebsites.net/api/NewGuid",
            "contentVersion": "1.0.0.0"
          }
        } 
    } 
  ],
  "outputs": {
    "result": {
      "type": "string",
      "value": "[reference('MyGuid').outputs.guid0.value]"
    }
  }
}

Example 2. Getting 2 guids and using them later

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {},
  "variables": {},
  "resources": [ 
    { 
        "apiVersion": "2015-01-01", 
        "name": "MyGuids", 
        "type": "Microsoft.Resources/deployments", 
        "properties": { 
          "mode": "incremental", 
          "templateLink": {
            "uri": https://armtemplates.azurewebsites.net/api/NewGuid?numberOfGuids=2,
            "contentVersion": "1.0.0.0"
          }
        } 
    }  
  ],
  "outputs": {
    "result0": {
      "type": "string",
      "value": "[reference('MyGuids').outputs.guid0.value]"
    },
    "result1": {
      "type": "string",
      "value": "[reference('MyGuids').outputs.guid1.value]"
    }
  }
}

If you still have any doubts about how to consume the outputs, check the GAB Science Lab template at line 398: https://github.com/intelequia/GAB2017ScienceLab/blob/master/lab/assets/GABClient.json#L398

Hope this helps! Best regards and happy coding!

[Video] Introduction to Application Insights

I forgot to publish on the blog the video about Application Insights that I edited last May, but better late than never :)

Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms, used to monitor live web applications. It automatically detects performance anomalies and provides analytics tools that help you diagnose issues and understand what users are actually doing with your application. In this session we take a quick introduction to the service, looking at how to implement and configure it, both for brand-new web applications and for CMS-based websites.
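
As a quick taste of what the session covers, this is roughly what basic custom instrumentation looks like with the Application Insights SDK for .NET (a minimal sketch; the instrumentation key is a placeholder, and in a real web app it usually lives in ApplicationInsights.config):

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

// Point the SDK at your Application Insights resource (placeholder key)
TelemetryConfiguration.Active.InstrumentationKey = "00000000-0000-0000-0000-000000000000";

var telemetry = new TelemetryClient();
telemetry.TrackPageView("Home");                  // page view telemetry
telemetry.TrackEvent("CheckoutCompleted");        // custom business event
telemetry.TrackException(new InvalidOperationException("sample failure")); // error telemetry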

The storage account already exists error when redeploying an ARM template

Update 06 Jul 2016

It seems this is resolved in the latest Storage Resource Provider API, 2016-01-01. The schema documented at https://azure.microsoft.com/en-us/documentation/articles/resource-manager-template-storage/ is for 2015-06-15, and that version didn't support PUT operations. The new 2016-01-01 schema can be found in the Azure schemas repository.

So this new template now works as expected, and tags and other settings can be changed:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "name": "dnntest20160705",
      "type": "Microsoft.Storage/storageAccounts",
      "location": "[resourceGroup().location]",
      "apiVersion": "2016-01-01",
      "dependsOn": [ ],
      "tags": {
        "displayName": "MyStorageAccount"
      },
      "sku": {
        "name": "Standard_LRS"
      },
      "kind": "Storage"
    }
  ]
}

Notes:

  • Ensure you install the latest Azure PowerShell and SDKs, or the latest 2016-01-01 version won't be recognized
  • With the latest Azure SDK available (2.9.1) I get syntax errors on the template; it seems the schema was not included in that release. Despite the syntax errors, you can still deploy from Visual Studio without problems. I suppose this will be fixed in the next Azure SDK release.

Thanks to Tom FitzMacken for the indications.

_____________________________________________________________________________

Original issue description

It seems that when you deploy a storage account using an ARM template like the one below:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "name": "dnntest20160705",
      "type": "Microsoft.Storage/storageAccounts",
      "location": "[resourceGroup().location]",
      "apiVersion": "2015-06-15",
      "dependsOn": [ ],
      "tags": {
        "displayName": "MyStorageAccount"
      },
      "properties": {
        "accountType": "Standard_LRS"
      }
    }
  ]
}

and you then change the “displayName” tag value to something else or try to add a new tag, deploying the update fails with the following exception:

22:44:07 - [ERROR] New-AzureRmResourceGroupDeployment : 22:44:07 - Resource
22:44:07 - [ERROR] Microsoft.Storage/storageAccounts 'dnntest20160705' failed with message '{
22:44:07 - [ERROR]   "error": {
22:44:07 - [ERROR]     "code": "StorageAccountAlreadyExists",
22:44:07 - [ERROR]     "message": "The storage account named dnntest20160705 already exists under
22:44:07 - [ERROR] the subscription."
22:44:07 - [ERROR]   }
22:44:07 - [ERROR] }'
22:44:07 - [ERROR] At D:\temp\azureresourcegroup2\Scripts\Deploy-AzureResourceGroup.ps1:98 char:1

Workaround

Change the tag value through the portal or PowerShell, so that subsequent deployment updates work.
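
For the PowerShell route, something along these lines should work (a sketch with placeholder names; note the -Tag parameter format has changed across AzureRM module versions):

# Update the tag out-of-band so the template no longer needs to change it
# ("MyResourceGroup" and the new tag value are placeholders)
Set-AzureRmResource -ResourceGroupName "MyResourceGroup" `
    -ResourceName "dnntest20160705" `
    -ResourceType "Microsoft.Storage/storageAccounts" `
    -Tag @{ displayName = "MyStorageAccount2" } `
    -Force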

 

This sounds like a bug that will probably be resolved in a future release.

Password is expired when using Visual Studio Release Management

Today I was investigating an issue with Visual Studio Online Release Management: a deployment error related to the Azure credentials used for the deployment operation.


Going into the log details, the error happens on a Resource Manager task. The logs show that the password of the user account used to connect to Azure has expired.


And here comes something worth highlighting, because the configuration of the service connection between VS Online and Azure has evolved in recent months to support both the Azure Classic and Resource Manager models. Also note that the tasks you can configure in Visual Studio Release Management may work with one of these models or with both. Let me show you with an example.

Configuring Azure service connection on VS Online Release Management

To set up the Azure connection in Release Management, you need to click on “Manage Project”.


Once there, go to the “Services” tab; when you click on “New Service Endpoint” you will see two ways to connect to Azure: Azure Classic and the new Azure Resource Manager. A few months back there was only one option supporting both Classic and RM scenarios, but this changed later. The main difference now is the way each connection authenticates with Azure.


So a big difference here is the use of a service principal for Azure Resource Manager connections instead of a user principal for Azure Classic connections. This is important for our case, because the “Password is expired” error message we got refers to a user principal, not to a service principal, where the “password” and “expiration” concepts are different.

Note that depending on the task used in Release Management, you can use either connection, or only the Classic one.

Fixing the “password is expired” issue

Once we understand these concepts, we just have to fix the password expiration issue. The solution has two steps:

  1. Change the password of the user principal, and then update the Azure Classic connections with the new password. Note: use a long, very strong password for these user principals because of the second step; service principals come into play so we can stop using user principals in the future;
  2. To prevent this from happening again, change the password expiration policy for this account so the password never expires.

The first step can easily be done manually. For the second step, we need the help of the Azure AD PowerShell module.

Install Azure AD PowerShell module

In this MSDN article you can find all the information related to managing Azure AD via PowerShell: https://msdn.microsoft.com/en-us/library/jj151815.aspx. The Azure AD module is supported on the following Windows operating systems with the default version of the Microsoft .NET Framework and Windows PowerShell: Windows 8.1, Windows 8, Windows 7, Windows Server 2012 R2, Windows Server 2012 and Windows Server 2008 R2. You need to install two utilities:

  • The Microsoft Online Services Sign-In Assistant for IT Professionals
  • The Azure Active Directory Module for Windows PowerShell

Connecting to Azure AD

Once you have installed both utilities, open a PowerShell command prompt to start using the Azure AD cmdlets:

$msolcred = get-credential
connect-msolservice -credential $msolcred

Obviously, you need to enter admin credentials if you want to use the administrative cmdlets later, such as changing the password expiration policy.

Change the user principal password expiration policy

Once logged in, we can change the password expiration policy for the user with this script:

# Gets the current policy value
Get-MsolUser -UserPrincipalName "releasemanagement@mytenant.onmicrosoft.com" | select PasswordNeverExpires

# Changes the policy to never expire
Get-MsolUser -UserPrincipalName "releasemanagement@mytenant.onmicrosoft.com" | Set-MsolUser -PasswordNeverExpires $true

There is a good blog post about this at https://azure.microsoft.com/en-us/documentation/articles/active-directory-passwords-set-expiration-policy/

What about service principals? Do their passwords never expire?

Service principals work in a different way. When you create a service principal, you can specify the StartDate and EndDate of the service principal credential (by default, StartDate = now and EndDate = now + 1 year). You can change the EndDate in a similar way (it is not a boolean flag; you need to set the EndDate).
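
For example, with the MSOnline module you can inspect a service principal's credentials and add a new one with a longer validity period. A sketch (the AppPrincipalId and password below are placeholders):

# List the current credentials (with their StartDate/EndDate) of a service principal
Get-MsolServicePrincipalCredential -AppPrincipalId "11111111-1111-1111-1111-111111111111" -ReturnKeyValues $false

# Add a new password credential valid for five years
New-MsolServicePrincipalCredential -AppPrincipalId "11111111-1111-1111-1111-111111111111" `
    -Type Password -Value "a-long-and-strong-password" `
    -StartDate (Get-Date) -EndDate (Get-Date).AddYears(5)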

For more information, visit the MSDN article https://msdn.microsoft.com/en-us/library/dn194091.aspx

Azure Bootcamp 2016: wrap-up, and see you next year!

After recovering from the Global Azure Bootcamp 2016 hangover, all that is left is to thank everyone who added their grain of sand to what has been the best edition of all the bootcamps we have run in recent years. We started in 2014 with close to 140 attendees, last year we passed 200, and this year the figures skyrocketed:

  • 450 registered one month before the event date, reaching the capacity limit of the venue; in the end, around 390 attendees in total
  • 3 session tracks, for a total of 21 sessions selected from a pre-selection of 60 talks. All the session resources are already linked from the agenda at http://azurebootcamp.es
  • Live streaming through Channel 9, with the recordings published on the same channel: http://aka.ms/azurebootcampstreaming
  • 13 sponsors, doubling the number from the last edition
  • 1 global lab (RacingLab with an Azure backend) with 365 simultaneous participants
  • 187 locations in 62 countries around the world

We have received very positive feedback and have taken note of a few issues to improve for the next edition; there is no stopping this now. We have also uploaded some photos of the event, already linked from the Azure Bootcamp website, including the photo finish of those who stayed past 19:00 for the final prize draw (we have “El Preguntón” recorded on Channel 9!).

Once again, thank you all for attending, for collaborating and, above all, for making it such a great time. See you in 2017.