
MSDN Blogs: BUILD BOT with Microsoft Bot Framework REST API


This post describes the basic steps for building your bot using the Microsoft Bot Framework 3 REST API.
It’s very easy to build a bot using the language SDKs (.NET, Node.js), but if you are using another programming language such as PHP, Ruby, or Python with scientific libraries, this post may be helpful for your development.

If you have ever built a bot using the Skype Bot Platform or Microsoft Bot Framework 1.0, there is one important note about the new Microsoft Bot Framework 3.
The Microsoft Bot Framework is essentially a common developer platform for building and hosting your bot (the concept is similar to Hubot), and it connects to the individual bot infrastructures (Slack bot, Skype bot, etc.). In the v1 endpoints, the Microsoft Bot Framework and the Skype Bot Platform were separate from each other, and you could build your Skype bot using either the Bot Framework or the Skype Bot Platform.
Microsoft Bot Framework 3, on the other hand, includes the developer platform for Skype bots, and all Skype bot set-up can be done through the Microsoft Bot Framework. That is, if you want to build a Skype bot, you simply use Microsoft Bot Framework 3. In addition, as you will see later in this post, several concepts in the Bot Framework (calling pattern, authentication, etc.) are inherited from the original Skype Bot Platform. (The platform design has changed from v1.)

Notice : Microsoft Bot Framework v2 is not public (internal build only). You can use v1 and v3.

Overview of Call Flow

Before you start, you must register your bot in the developer portal (https://dev.botframework.com/bots). In this blog post, we assume that this registration is already done.
When you register your bot, it is also registered in the Application Registration Portal (https://apps.dev.microsoft.com), where you can get the client id and client secret for your bot. These values (client id and client secret) are used for the authentication I describe later.

The following picture illustrates the calling flow of the Microsoft Bot Framework.
The Microsoft Bot Framework provides the basic platform for bot hosting and connects to each communication channel (Slack, Skype, etc.) facing end users through the Bot Connector. Your code (your bot) interacts with the Microsoft Bot Framework in the backend. That is, your code (your bot) must communicate with the Microsoft Bot Framework only.

If you want to connect to other channels (Slack, Facebook Messenger, etc.), you must set them up in the portal (https://dev.botframework.com/bots). The Skype bot infrastructure (the Skype channel), however, is connected to the Microsoft Bot Framework from the start. (No extra work is needed except for publishing to the Skype bot directory.)

Authentication Flow (outgoing – your code to bot framework)

Before starting communications, you must learn about the authentication used for secure communication.

The messages sent to the Bot Framework (from your bot) must be protected with the Azure AD v2 endpoint; otherwise malicious code could call the Microsoft Bot Framework instead of your bot.
In this section, I explain how to accomplish this flow.

The Bot Framework uses an app-only access token from the Azure AD v2 endpoint. To get this kind of access token, you just send the following HTTP request.
As described before, you must retrieve the client_id and client_secret from the Application Registration Portal beforehand and set them as follows.

POST https://login.microsoftonline.com/common/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials&client_id=1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a&client_secret=6wyxeJ4...&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default

As a result you receive the following HTTP response, which includes the access token. (Note that this access token expires in 1 hour.)

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{
  "token_type": "Bearer","expires_in": 3599,"ext_expires_in": 0,"access_token": "eyJ0eXAiOi..."
}
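
If you are writing your bot in PHP (as in the verification sample later in this post), a minimal sketch of this token request could look like the following. It simply posts the same parameters shown above using cURL; the helper name get_bot_token is just for illustration.

<?php
// Minimal sketch: request an app-only access token from the Azure AD v2 endpoint.
// $client_id and $client_secret are the values from the Application Registration Portal.
function get_bot_token($client_id, $client_secret) {
  $ch = curl_init('https://login.microsoftonline.com/common/oauth2/v2.0/token');
  curl_setopt($ch, CURLOPT_POST, TRUE);
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
  curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'grant_type'    => 'client_credentials',
    'client_id'     => $client_id,
    'client_secret' => $client_secret,
    'scope'         => 'https://graph.microsoft.com/.default'
  )));
  $res = json_decode(curl_exec($ch), TRUE);
  curl_close($ch);
  // The token expires in about 1 hour ("expires_in": 3599), so cache it
  // and request a new one shortly before it expires.
  return $res['access_token'];
}
?>

Because the token is only valid for about an hour, it is worth caching the value and refreshing it shortly before expiry rather than requesting a new token on every outgoing call.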

Hold this access token in your bot application, because you must always set it in the “Authorization” header of your outgoing messages, as follows. The Microsoft Bot Framework verifies this token to check whether the request was sent from a valid (registered) bot.

Notice : This access token also includes the claims (client id, expiration, etc.) used for the communication, and the Microsoft Bot Framework can identify your bot from this access token. (Please see the previous post “How to verify the OAuth token with the v2.0 endpoint“.)

POST https://skype.botframework.com/v3/conversations/29%3A1iFtpwQ.../activities/6Bt4f5iryCI
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "type": "message",
  "timestamp": "2016-08-18T09:22:54.1811797Z",
  "from": {
    "id": "28:1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a",
    "name": "Echo Bot"
  },
  "conversation": {
    "id": "29:1iFtpwQ..."
  },
  "recipient": {
    "id": "29:1iFtpwQ...",
    "name": "Tsuyoshi Matsuzaki"
  },
  "text": "Hello !",
  "replyToId": "6Bt4f5iryCI"
}

Authentication Flow (incoming – bot framework to your code)

The messages from the Microsoft Bot Framework are also protected by an Authorization header, as follows (see the header in the request below). In this case, your bot must verify the message for secure communication. (If you ignore this header, your code could be called by malicious code.)

POST https://example.com/yourbot
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "type": "contactRelationUpdate",
  "id": "6Bt4f5iryCI",
  "timestamp": "2016-08-18T09:22:50.927Z",
  "serviceUrl": "https://skype.botframework.com",
  "channelId": "skype",
  "from": {
    "id": "29:1iFtpwQ...",
    "name": "Tsuyoshi Matsuzaki"
  },
  "conversation": {
    "id": "29:1iFtpwQ..."
  },
  "recipient": {
    "id": "28:1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a",
    "name": "Echo Bot"
  },
  "action": "add"
}

How do you verify this header value?
In this case, Azure AD is not used. A key published by the Bot Framework is used for this authentication (AuthN and AuthZ).
First, you must retrieve the OpenID / OAuth configuration information hosted at https://api.aps.skype.com/v1/.well-known/openidconfiguration. It returns the following. (Note that this might change in the future, so don’t copy this JSON result into your production code.)

Notice : If you’re using Emulator (debugging), you must use https://login.microsoftonline.com/common/v2.0/.well-known/openid-configuration (Azure AD v2) instead of https://api.aps.skype.com/v1/.well-known/openidconfiguration.

GET https://api.aps.skype.com/v1/.well-known/openidconfiguration
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{
  "issuer": "https://api.botframework.com",
  "authorization_endpoint": "https://invalid.botframework.com",
  "jwks_uri": "https://api.aps.skype.com/v1/keys",
  "id_token_signing_alg_values_supported": [
    "RSA256"
  ],
  "token_endpoint_auth_methods_supported": [
    "private_key_jwt"
  ]
}

The public key (X.509 certificate) is stored at the “jwks_uri” location above. You must retrieve the key list and use it to verify the “Authorization” header (access token).
As I explained in the previous post “How to verify the OAuth token with the v2.0 endpoint“, this verification is accomplished in a few simple steps.
Here is a PHP example of this verification.

<?php
// Authorization header value
$token = "eyJ0eXAiOi...";
// 0:Invalid, 1:Valid
$token_valid = 0;
// 1 separate token by dot (.)
$token_arr = explode('.', $token);
$headers_enc = $token_arr[0];
$claims_enc = $token_arr[1];
$sig_enc = $token_arr[2];

// 2 base 64 url decoding
$headers_arr = json_decode(base64_url_decode($headers_enc), TRUE);
$claims_arr = json_decode(base64_url_decode($claims_enc), TRUE);
$sig = base64_url_decode($sig_enc);

// 3 get key list
$keylist = file_get_contents('https://api.aps.skype.com/v1/keys');
$keylist_arr = json_decode($keylist, TRUE);
foreach($keylist_arr['keys'] as $key => $value) {
  // 4 select one key (which matches)
  if($value['kid'] == $headers_arr['kid']) {
    // 5 get public key from key info
    $cert_txt = '-----BEGIN CERTIFICATE-----' . "\n" . chunk_split($value['x5c'][0], 64) . '-----END CERTIFICATE-----';
    $cert_obj = openssl_x509_read($cert_txt);
    $pkey_obj = openssl_pkey_get_public($cert_obj);
    $pkey_arr = openssl_pkey_get_details($pkey_obj);
    $pkey_txt = $pkey_arr['key'];
    // 6 verify signature
    $token_valid = openssl_verify($headers_enc . '.' . $claims_enc, $sig, $pkey_txt, OPENSSL_ALGO_SHA256);
  }
}

// 7 show result
if($token_valid == 1)
  echo 'Token is Valid';
else
  echo 'Token is Invalid';

// Helper functions
function base64_url_decode($arg) {
  $res = $arg;
  $res = str_replace('-', '+', $res);
  $res = str_replace('_', '/', $res);
  switch (strlen($res) % 4) {
    case 0:
    break;
    case 2:
    $res .= "==";
    break;
    case 3:
    $res .= "=";
    break;
    default:
    break;
  }
  $res = base64_decode($res);
  return $res;
}
?>

Messaging Flow – Adding your bot

The authentication flow is all done! All you have to do now is communicate with the Microsoft Bot Framework over HTTP (REST-style messaging).
Let’s see how.

First, when your bot is added (subscribed to) by a user, the following HTTP webhook arrives at your bot endpoint. As explained above, you must check the “Authorization” header value and then proceed with whatever actions you need.

Notice : In this example, I’m using Skype.

POST https://example.com/yourbot
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "type": "contactRelationUpdate",
  "id": "6Bt4f5iryCI",
  "timestamp": "2016-08-18T09:22:50.927Z",
  "serviceUrl": "https://skype.botframework.com",
  "channelId": "skype",
  "from": {
    "id": "29:1iFtpwQ...",
    "name": "Tsuyoshi Matsuzaki"
  },
  "conversation": {
    "id": "29:1iFtpwQ..."
  },
  "recipient": {
    "id": "28:1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a",
    "name": "Echo Bot"
  },
  "action": "add"
}

The “type” and “action” attributes describe what kind of bot event occurred. In this case, they mean that your bot has been added to the user’s contact list.

The “from” attribute is the user id. In this case it is the Skype user whose id is “1iFtpwQ…”. Your bot should store this “from” id in your database, because your bot communicates with each of its users (subscribers) using this id.
The “recipient” attribute is the destination id. In this example it is your bot, whose client id is “1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a”.
The “id” attribute is called the activity id. This id is sometimes referenced by other communications. (I show you later.)

Notice : The prefix “29” denotes a Skype user, and “28” denotes a bot.

If your bot accepts this request, it just responds with HTTP status 202.

HTTP/1.1 202 Accepted

Of course, you can also reply with some messages (for example, bot usage info) in response to this add event; I will show you how to post outgoing messages later.
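
As a rough illustration, here is a minimal PHP sketch of the shape of such a webhook handler: it reads the posted JSON, stores the sender id when the bot is added, and returns HTTP 202. Token verification (shown in the authentication section above) is omitted, and save_subscriber is a hypothetical helper for your own storage.

<?php
// Minimal webhook sketch (Authorization header verification omitted - see above).
$activity = json_decode(file_get_contents('php://input'), TRUE);

if ($activity['type'] == 'contactRelationUpdate' && $activity['action'] == 'add') {
  // Store the user id ("29:..." on Skype) so the bot can message this user later.
  save_subscriber($activity['from']['id'], $activity['from']['name']);  // hypothetical helper
}

// Accept the request.
header('HTTP/1.1 202 Accepted');
?>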

When your bot is removed from a user’s contact list, the following HTTP request (webhook) is received. (As you can see, the action attribute is set to “remove”.)
In this case you also respond with HTTP status 202 as the successful response.

POST https://example.com/yourbot
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "type": "contactRelationUpdate",
  "id": "X4KtWvi9XS",
  "timestamp": "2016-08-18T09:48:19.201Z",
  "serviceUrl": "https://skype.botframework.com",
  "channelId": "skype",
  "from": {
    "id": "29:1iFtpwQ...",
    "name": "Tsuyoshi Matsuzaki"
  },
  "conversation": {
    "id": "29:1iFtpwQ..."
  },
  "recipient": {
    "id": "28:1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a",
    "name": "Echo Bot"
  },
  "action": "remove"
}

Messaging Flow – Incoming message

If the user sends the message “Good morning !” to your bot, the following HTTP webhook arrives.

POST https://example.com/yourbot
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "type": "message",
  "id": "4GhGAlkzDAK2I5lw",
  "timestamp": "2016-08-18T09:31:31.756Z",
  "serviceUrl": "https://skype.botframework.com",
  "channelId": "skype",
  "from": {
    "id": "29:1iFtpwQ...",
    "name": "Tsuyoshi Matsuzaki"
  },
  "conversation": {
    "id": "29:1iFtpwQ..."
  },
  "recipient": {
    "id": "28:1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a",
    "name": "Echo Bot"
  },
  "text": "Good morning !",
  "entities": []
}

This incoming message is very similar to the previous one, so there is no need to explain the details.
If your bot accepts this request, it just responds with HTTP status 202.

HTTP/1.1 202 Accepted

Messaging Flow – Outgoing message

On the other hand, when your code sends an outgoing message (a message from your bot to the user), you send the following HTTP request to the Microsoft Bot Framework.

POST https://skype.botframework.com/v3/conversations/29%3A1iFtpwQ.../activities/4GhGAlkzDAK2I5lw
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "type": "message",
  "timestamp": "2016-08-18T09:31:36.2281894Z",
  "from": {
    "id": "28:1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a",
    "name": "Echo Bot"
  },
  "conversation": {
    "id": "29:1iFtpwQ..."
  },
  "recipient": {
    "id": "29:1iFtpwQ...",
    "name": "Tsuyoshi Matsuzaki"
  },
  "text": "Good morning !",
  "replyToId": "4GhGAlkzDAK2I5lw"
}

The “29%3A1iFtpwQ…” in the URI (which is URL-encoded) is the conversation id. When your bot is sending a message to a user, this conversation id is the user id itself.
You can reply to a specific incoming message (i.e., bidirectional messaging). The “4GhGAlkzDAK2I5lw” above is the incoming “id” attribute (i.e., the activity id), and this sample is replying to that incoming message.
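
As a sketch, the reply above could be posted from PHP roughly as follows. The access token is the one obtained in the authentication section, and the conversation id and replyToId are taken from the incoming activity; this is just a plain HTTP call with cURL, not SDK code.

<?php
// Minimal sketch: reply to an incoming activity.
$access_token = 'eyJ0eXAiOi...';                   // token from the authentication section
$service_url = 'https://skype.botframework.com';   // "serviceUrl" of the incoming activity
$conversation_id = '29:1iFtpwQ...';                // "conversation" id of the incoming activity
$reply_to_id = '4GhGAlkzDAK2I5lw';                 // "id" (activity id) of the incoming activity

$url = $service_url . '/v3/conversations/' . urlencode($conversation_id)
     . '/activities/' . $reply_to_id;
$body = json_encode(array(
  'type' => 'message',
  'from' => array('id' => '28:1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a', 'name' => 'Echo Bot'),
  'conversation' => array('id' => $conversation_id),
  'recipient' => array('id' => $conversation_id, 'name' => 'Tsuyoshi Matsuzaki'),
  'text' => 'Good morning !',
  'replyToId' => $reply_to_id
));

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, TRUE);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
  'Authorization: Bearer ' . $access_token,
  'Content-Type: application/json; charset=utf-8'
));
curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
curl_exec($ch);   // expect HTTP 202 on success
curl_close($ch);
?>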

Conversely, you can also send the user one-way messages, for example from a timer bot or a notification bot. In that case, you leave the activity id blank (omit it from the URI), as follows.

POST https://skype.botframework.com/v3/conversations/29%3A1iFtpwQ.../activities
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "type": "message",
  "timestamp": "2016-08-18T09:31:36.2281894Z",
  "from": {
    "id": "28:1f7dd6e9-cbd7-4c38-adf2-6e9bcab5310a",
    "name": "Echo Bot"
  },
  "conversation": {
    "id": "29:1iFtpwQ..."
  },
  "recipient": {
    "id": "29:1iFtpwQ...",
    "name": "Tsuyoshi Matsuzaki"
  },
  "text": "Good morning !"
}

If the Microsoft Bot Framework accepts this message, HTTP status 202 is returned. (As I explained, the “Authorization” header is checked by the framework.)

HTTP/1.1 202 Accepted

State Handling

The Microsoft Bot Framework itself provides a state infrastructure (called the “Bot State Service”). With this infrastructure you can build a stateful bot that scales.
Now I will show you how to use this state with the REST API.

When you want to set user state in the Bot State Service, you send the following HTTP request to the Bot State Service endpoint (https://state.botframework.com).
The URI must be /v3/botstate/{channelId}/users/{userId}. The following example uses Skype as the bot channel.

POST https://state.botframework.com/v3/botstate/skype/users/29%3A1iFtpwQ...
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "eTag": "*",
  "data": {
    "DemoData1": "Test Data 1"
  }
}

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{
  "data": {
    "DemoData1": "Test Data 1"
  },
  "eTag": "W/\"datetime'2016-08-18T10%3A12%3A45.4398159Z'\""
}

The saved data is stored in the state service, and you can retrieve it later by calling the GET method.

GET https://state.botframework.com/v3/botstate/skype/users/29%3A1iFtpwQ...
Authorization: Bearer eyJ0eXAiOi...
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

{
  "data": {
    "DemoData1": "Test Data 1"
  },
  "eTag": "W/\"datetime'2016-08-18T10%3A12%3A45.4398159Z'\""
}

The bot state has two scopes. One is the user-scoped state (as in the previous example), and the other is the conversation-scoped state.
If you want to use the conversation-scoped state, you send the HTTP request to /v3/botstate/{channelId}/conversations/{conversationId} instead of /v3/botstate/{channelId}/users/{userId}, as follows.

POST https://state.botframework.com/v3/botstate/skype/conversations/29%3A1iFtpwQ...
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json; charset=utf-8

{
  "eTag": "*",
  "data": {
    "DemoData2": "Test Data 2"
  }
}

Note that these writes will fail if another instance of your bot has already changed the stored object; the eTag returned by the service is used for this concurrency check.
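
To handle that concurrency check, a rough PHP sketch could read the current state first, echo the returned eTag back in the update, and retry if the service rejects a stale eTag (this assumes the service answers a conflicting write with a precondition-failed status; the URL and payload match the examples above).

<?php
// Minimal sketch: optimistic-concurrency update of user state.
$access_token = 'eyJ0eXAiOi...';   // token from the authentication section
$state_url = 'https://state.botframework.com/v3/botstate/skype/users/' . urlencode('29:1iFtpwQ...');

// 1. GET the current state (returns the current "data" and "eTag").
$ch = curl_init($state_url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Bearer ' . $access_token));
$current = json_decode(curl_exec($ch), TRUE);
curl_close($ch);

// 2. POST the modified state back, echoing the eTag we just read.
//    If another bot instance changed the state in the meantime, the write is
//    rejected (assumed: an HTTP precondition-failed status) and we should re-read and retry.
$current['data']['DemoData1'] = 'Updated Data 1';
$payload = json_encode(array('eTag' => $current['eTag'], 'data' => $current['data']));

$ch = curl_init($state_url);
curl_setopt($ch, CURLOPT_POST, TRUE);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
  'Authorization: Bearer ' . $access_token,
  'Content-Type: application/json; charset=utf-8'
));
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);   // 200 on success, a conflict status otherwise
curl_close($ch);
?>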

Notice : Even for user state, the state is not shared with other bots. (Each bot has its own separate state.)

 

In this blog post I’ve explained the basic concepts and steps for building your bot with the REST API. As you can see, you don’t need to worry about which programming languages the SDKs support, so please enjoy your bot development, everyone!
The Microsoft Bot Framework can also handle more advanced scenarios such as attachments, rich text, cards, and audio. (Video in Skype will be available in a future release.) Next, I will explain these advanced scenarios.

For more details (API reference, arguments, etc.), please see the following official documentation. Thanks!

Bot REST API Authentication
https://docs.botframework.com/en-us/restapi/authentication/

Bot Connector REST API 3.0
https://docs.botframework.com/en-us/restapi/connector/

Bot State REST API 3.0
https://docs.botframework.com/en-us/restapi/state/


MSDN Blogs: SharePoint Framework Developer Preview is here!



Yesterday Microsoft released the long-awaited developer preview of the SharePoint Framework. With the SharePoint Framework, developers can start building web parts that run on the client (in JavaScript) and use modern frameworks such as AngularJS or React, together with the freshly released Office Fabric UI for React. Everything can be tested in a very lightweight environment called the SharePoint Workbench; there is no longer any need to install a full SharePoint Server, or to test against a developer tenant in Office 365, just to try out web parts. If you want to jump into development right away, you will find a guide on how to get started on GitHub.


You can learn more from the official announcement.

MSDN Blogs: Are you coming to the Microsoft Australia Partner Conference 2016?



It’s only 4 weeks until the Microsoft Annual Partner Conference on the Gold Coast, running from the 5th to the 8th of September 2016. I thought I’d take a good long look at the agenda and pull out the critical sessions that I think will be invaluable to Microsoft partners who sell into Education, both for this year’s sales and for longer-term strategy. There are nearly 100 different sessions happening during the four-day conference, with specific sessions for sales or marketing professionals, directors and strategists. As with previous years, there are some recognisable national and international speakers and thought leaders.

So here’s my take on a proposed route through the agenda for any partner that wants to build their Education business over the next 12 months and beyond. Have a look at the sessions below; you can visit the APC website for more information on the sessions and the speakers, and to start planning how to make the most effective use of the time you’re investing in the conference:

Monday:

  • Welcome Keynote and Microsoft Australia Partner Awards (5-6:30pm)

Tuesday:

  • Keynote: Defining Your Future in Partnership (8:45-9:45am)
  • Build a Profitable Cloud Business with Microsoft (10-10:45 am)
    or
    Growing your Linux and Open Source Cloud Business on Azure (10-10:45 am)
  • Moving Beyond Analytics to Enable Digital Transformation (11:15-12pm)
  • Creating effective marketing campaigns to sell technology solutions (1-1:45pm)
  • Education – Digital Disruption to the extreme (2-2:45pm)
  • Unlock Real Value through Differentiation (3:15-4pm)
  • Keynote: Unleashing Innate Creativity for Innovation (4:15-5pm)

Wednesday

  • Keynote: Beyond Boring Business (8:45-9:45 am)
  • The Future is Smart: How the Internet of Things is changing the way we live and work (10-10:45am)
  • Business Design – how to turn your ideas into a successful business (11:15-12pm)
  • Keynote: The Business of Being Human (1-1:45pm)
  • Building a profitable Azure practice (2-2:45pm)
  • Effective co-selling with Microsoft’s Education business on Office 365 and Azure (3:15-4pm)
  • Driving Great Profit Out of Your Existing Office 365 Customer or Transition to Becoming a MSP (4:15-5pm)

Thursday

  • Keynote: Unlock the financial potential of your future leaders (10-10:45am)
  • When Humans and Technology collide (11:30-12:15)
    or
    Masterclass: Building a profitable Azure practice (11:30am-1:30pm)
  • Beyond the hype of artificial intelligence (12:30-1:15pm)
  • The Analytics opportunity for every partner in Education (2:15-3pm)
  • Kat Holmes Design Thinking Final Keynote (3:15-4:15pm)

Of course, you won’t want to miss the three Education-specific sessions in the agenda, where you’ll find me and my colleagues sharing insights into how to meet the needs of education customers in Australia, and which strategies can help you best respond to market trends.

See you on the Gold Coast!

Make a date: Find out more, and register for APC 2016

MSDN Blogs: Small Basic Gurus, step up and be known!

August Gurus, step up and show us your knowledge of the latest and greatest technologies Microsoft has to offer!

And for your efforts, eminent leaders in your technology will evaluate your contributions and award real virtual medals!

All you have to do is add an article to TechNet Wiki from your own specialist field. Something that fits into one of the categories listed on the submissions page. Copy in your own blog posts, a forum solution, a white paper, or just something you had to solve in your own day’s work today.

Drop us some nifty knowledge, or superb snippets, and become MICROSOFT TECHNOLOGY GURU OF THE MONTH!

This is an official Microsoft TechNet recognition, where people such as yourselves can truly get noticed!

HOW TO WIN

1) Please copy over your Microsoft technical solutions and revelations to TechNet Wiki.

2) Add a link to it on THIS WIKI COMPETITION PAGE (so we know you’ve contributed)

3) Every month, we will highlight your contributions, and select a “Guru of the Month” in each technology.

If you win, we will sing your praises in blogs and forums, similar to the weekly contributor awards. Once “on our radar” and making your mark, you will probably be interviewed for your greatness, and maybe eventually even invited into other inner TechNet/MSDN circles!

Winning this award in your favoured technology will help us learn who the active members in each community are.

Below are June’s mighty winners and contenders!

Guru Award: Small Basic Technical Guru – June 2016

GOLD: Nonki Takahashi – Small Basic: Mouse
  • Ed Price: “I love this series! Nonki takes us end to end on using the mouse in Small Basic. I love the definition! Very strong See Also section.”
  • Michiel Van Hoorn: “Nonki again delivered a great article on a SB function (MOUSE). Very useful if you want to interact with mouse”
  • Carmelo La Monica: “I don’t know small basic, but this is good point to learn. Congrats.”
  • Alan Carlos: “”

SILVER: Deva – Small Basic Videos
  • Carmelo La Monica: “Very good videos, good fo to learn Small Basic. Congrats for all videos.”
  • Alan Carlos: “Good guidance, very helpful!”
  • Michiel Van Hoorn: “Nice overview of available videos”
  • Ed Price: “Great to have this list on the Wiki! Thanks!”

 

Small and Basically yours,

  • Peter Laker (Microsoft MVP) &
  • Ninja Ed

MSDN Blogs: Using Regular Expressions and Event Viewer with PowerShell

Today a customer asked me to help him filter an Event Viewer log file.
The customer had found a lot of connection errors with an Oracle database. The errors were being written to the Event Viewer and, because the number of errors was huge, it was hard for the customer to confirm that all the errors were of the same type (ORA-02067).
I told the customer that it was possible to use PowerShell and Regular Expressions (RegEx) to help retrieve the Event Viewer logs.
Let me show an example of how to retrieve the records of the Application log of Event Viewer. I am filtering the logs by the current date and putting the result in a variable.
$logs = Get-EventLog -LogName Application | ? { ($_.TimeWritten -ge $(get-date).Date )  }
But the customer told me that he had the logs in an Event Viewer file (.evtx). Therefore, to read the file and store the values in a variable, I used the following command:
$logs = Get-WinEvent -Path C:\temp\events.evtx
I already knew that most of the Event Viewer events contained the following error message:
Exception message: ORA-02067: transaction or savepoint rollback required
I wanted to check whether there were other Oracle errors, for example ORA-0205, rather than only ORA-02067. Therefore, I used the following RegEx pattern to extract the value of ORA-?????:
"ORA-[\d.]+"

With this pattern, I search for the word ORA followed by a dash (“-”) and at least one digit.

To avoid receiving the entire log message for the records that match the rule, and to extract only the “ORA-?????” value within the message, I assigned an identifier (errorid) to the part I wanted to capture.

"(?<errorid>ORA-([\d.]+))"
To search for the pattern in the $logs array, I used a loop (foreach) in which I check item by item and store the results found in another array ($errorcodes) for later grouping.

$errorcodes = @()
foreach ($log in $logs)
{
  # Keep only the named capture group when the message matches the ORA-nnnnn pattern
  if ($log.Message -match "(?<errorid>ORA-([\d.]+))")
  {
    $errorcodes += $Matches["errorid"]
  }
}
$errorcodes | group | select name, count

For this scenario, I found only the ORA-02067 error:
Name          Count
----          -----
ORA-02067       926
Another way to run the above is through the PowerShell type accelerator [RegEx], which is a shortcut for the System.Text.RegularExpressions.Regex class of the .NET Framework:
$errorcodes = @()
foreach ($log in $logs)
{
  # Match the ORA-nnnnn pattern using the .NET Regex class directly
  $address = [Regex]::Match($log.Message, "ORA-([\d.]+)")
  if ($address.Success)
  {
    $errorcodes += $address.Value
  }
}
$errorcodes | group | select name, count
Since we now know that there is only the ORA-02067 error, it is also possible to get the count from the event log with a simple -like filter:
($logs | ? message -like "*ORA-02067*").count
I hope this post has been helpful. For more details about Regular Expressions, take a look at:

Regular Expression Language – Quick Reference

https://msdn.microsoft.com/en-us/library/az24scfc(v=vs.110).aspx

MSDN Blogs: Creating an MA Extension


In this series of posts we will review and discuss the basic fundamentals of rules extensions: how to get started and how to develop, build, and deploy extensions. Please note that all blog postings, discussions and code within the Connector Space blog are to be used at your own risk. Every effort has been made to provide the most accurate information, but typos and other mistakes sometimes slip into these blog posts, so be sure to test all information, especially code, in your development environment before using it in your production environment.

This post focuses on the creation of the initial MA extension; additional posts will focus on the individual sections of the extension.

Additionally, I am writing this one MA extension to be used across all management agents of like types in my environment, instead of writing individual extensions for each MA.

Additional links for understanding Management Agent rules extensions
Management Agent Rules Extensions

Considerations:

1.) How many “MA’s” (Management Agents) will be referencing this extension?

2.) Are there different types of Management Agents that need a custom extension written?

3.) Will this extension be used for all of your attribute flows or will you also be using Sync Rules created in the FIM / MIM Portal?

4.) How many total custom attribute flows will be needed?

From My Experience:

I usually create one MA extension to use for all MAs of the same type. For example, if I had 3 Active Directory Domain Services management agents and 2 SQL Server management agents, I would create one extension for the Active Directory Domain Services management agents and another for the SQL management agents.

Getting Started:

Open up the Synchronization Service


Click on the Management Agents Tab at the top


Click on the management agent you wish to create an extension for. In my example I will be creating one extension to be used for both the Contoso ADMA and the Fabrikam ADMA, but I only select one of the MAs.


Right-click on the MA and select Create Extension Projects…, followed by Rules Extension.


A Create Extension Project pop-up window will appear.

For programming language: Select “Visual C#”

Visual Studio Version: Select the version of Visual Studio you will be developing your code in.

Project name: If you’re only creating one extension to be used (referenced) by all your MAs, you could use the default name of “MAExtension”, but if you have multiple sync solutions you may want to be more specific in the naming convention to avoid confusion.

Project location: By default, the local Documents folder will be pre-populated, but this can be changed to any desired location.


After you have carefully selected all of your options and named your project, leave Launch in VS.Net IDE selected if Visual Studio is local to the server you are on, and click OK.

After a few seconds your new project should load into Visual Studio (if Launch in VS.Net IDE was selected). If you chose not to launch Visual Studio, or Visual Studio is not local to your server, you will just need to open Visual Studio manually and import the project from its location.


Click on the .cs file.


You’re now ready to start developing your MA extension.

Let’s take a look at the initial Code

using System;
using Microsoft.MetadirectoryServices;

namespace Mms_ManagementAgent_CrossForestADMAExtension
{
    /// <summary>
    /// Summary description for MAExtensionObject.
    /// </summary>
    public class MAExtensionObject : IMASynchronization
    {
        public MAExtensionObject()
        {
            //
            // TODO: Add constructor logic here
            //
        }

        void IMASynchronization.Initialize()
        {
            //
            // TODO: write initialization code
            //
        }

        void IMASynchronization.Terminate()
        {
            //
            // TODO: write termination code
            //
        }

        bool IMASynchronization.ShouldProjectToMV(CSEntry csentry, out string MVObjectType)
        {
            //
            // TODO: Remove this throw statement if you implement this method
            //
            throw new EntryPointNotImplementedException();
        }

        DeprovisionAction IMASynchronization.Deprovision(CSEntry csentry)
        {
            //
            // TODO: Remove this throw statement if you implement this method
            //
            throw new EntryPointNotImplementedException();
        }

        bool IMASynchronization.FilterForDisconnection(CSEntry csentry)
        {
            //
            // TODO: write connector filter code
            //
            throw new EntryPointNotImplementedException();
        }

        void IMASynchronization.MapAttributesForJoin(string FlowRuleName, CSEntry csentry, ref ValueCollection values)
        {
            //
            // TODO: write join mapping code
            //
            throw new EntryPointNotImplementedException();
        }

        bool IMASynchronization.ResolveJoinSearch(string joinCriteriaName, CSEntry csentry, MVEntry[] rgmventry, out int imventry, ref string MVObjectType)
        {
            //
            // TODO: write join resolution code
            //
            throw new EntryPointNotImplementedException();
        }

        void IMASynchronization.MapAttributesForImport(string FlowRuleName, CSEntry csentry, MVEntry mventry)
        {
            //
            // TODO: write your import attribute flow code
            //
            throw new EntryPointNotImplementedException();
        }

        void IMASynchronization.MapAttributesForExport(string FlowRuleName, MVEntry mventry, CSEntry csentry)
        {
            //
            // TODO: write your export attribute flow code
            //
            throw new EntryPointNotImplementedException();
        }
    }
}

To continue with your MA extension, please visit the following posts.

IMASynchronization.ShouldProjectToMV

IMASynchronization.MapAttributesForJoin

IMASynchronization.MapAttributesForImport

IMASynchronization.MapAttributesForExport

MSDN Blogs: Introducing the Visual Studio ALM Rangers – Igor Rosa Macedo


Who are you?

Igor Rosa Macedo, a DevOps and ALM consultant at ESX (a Brazilian Microsoft Partner). I am a technology enthusiast and I believe in using technology to improve the productivity and evolution of people and enterprises. I’ve been working with TFS since TFS 2008, first as a user and then as a consultant, giving training, making customizations and doing implementations. I’m also a developer, and I worked for several years developing code for SharePoint and .NET applications.

What makes you “tick”?

Learning new technologies, finding ways to deliver value to people, and building something that will make a difference in their lives.

Where do you live?

I live in Uberlândia-MG, Brazil

Where is the place you call home?

Uberlândia-MG, Brazil, with my family, friends and good food.

Why are you active in or with the Rangers program?

I’m new here, but I’ve been filling the gaps in TFS for my customers since 2008. Now I want to contribute to the community through the ALM Rangers projects.


This post is part of an ongoing series of Rangers introductions. See Ranger Index (Who is Who?) for more details.

MSDN Blogs: ADConnect and Microsoft Cloud Deutschland


After I read that the new version of ADConnect (from 1.1.180.0 onward) now also supports Microsoft Cloud Deutschland, my curiosity was piqued, and what can I say: it works. How? Here it is in short form…

ADConnect is the (worthy) successor of Windows Azure Active Directory Sync (DirSync) and Azure AD Sync. Since those two tools have been deprecated and their support runs out in April 2017, now would be a good time to switch, by the way… ADConnect is available for download here: (https://www.microsoft.com/en-us/download/details.aspx?id=47594).

With ADConnect you can set up synchronization between your on-premises Active Directory and Azure Active Directory. There are various configuration options; I will set up the simplest case, namely synchronizing certain users to Azure, without passwords.

Let’s go

For my test I installed a Windows Server 2012 R2 in Azure, promoted it to a DC and created a few users. Then I downloaded ADConnect onto it. If you read the instructions beforehand (but who does that?), you come across a passage that talks about verified domains. Hm. What is that? Quite simple: in order to use a domain name for your users in Azure AD, you first have to prove that the domain actually belongs to you. Why? Well, a given login can only exist once in Azure, and if somebody else had named their local AD “wigand.de” (which you can do) and then synchronized their users to Azure, I would look pretty silly, because then I could no longer do so myself… That is why domain verification exists, i.e. you prove that you have authority over the domain name. There are several ways to do this; the easiest is often to create a TXT record on the DNS server, because then I obviously own the domain, or at least I am its administrator.

The problem for Microsoft Cloud Deutschland is that the familiar front end for this domain verification in the Azure portal is not yet available for Microsoft Cloud Deutschland… But once again PowerShell rescues us from this dilemma, with the following few steps:

Connecting to MSOL

The first step is to establish the connection to Azure AD. For this there is the command Connect-MsolService (for preview customers of Microsoft Cloud Deutschland there is currently still a substitute for it). After connecting successfully, we can use all the Msol cmdlets, such as Get-MsolUser, Get-MsolGroup and, for example, Get-MsolDomain. Let’s just try that last cmdlet; among other things we see a column with the heading “Status”. That is just for information for now; we will come back to it later.

Adding a new domain

The next step is adding our desired domain. For this there is the cmdlet New-MsolDomain with a few parameters.

  • -Authentication: Managed or Federated. This refers to how we want to work with this domain later, i.e. whether we want to build a federation or not. As I wrote above, we are taking it slowly for now, so we won’t set up a federation and will use “Managed”.
  • -Name: The name of our domain, for example wigand.de, or a subdomain such as ad.wigand.de
  • -VerificationMethod: How do we prove that we have the necessary rights to the domain? Here we choose “DnsRecord”.

So our command looks like this:

New-MsolDomain -Authentication Managed -Name ad.wigand.de -VerificationMethod DnsRecord

If we entered everything correctly, we see our new domain in the output with the status “Unverified” and “Managed”.

Verifying the domain

Verifying the domain is done in two steps; OK, actually in three steps:

Start the domain verification

Azure gives us a DNS record that we have to enter on our DNS server, and once we have done that, Azure will also believe that the domain is ours. So let’s go. We can choose between an MX or a TXT record; here is the TXT record:

Get-MsolDomainVerificationDns -DomainName ad.wigand.de -Mode DnsTxtRecord

Please note: here the parameter is now called “-DomainName”. In the response we see the required record:

Label : ad.wigand.de
Text  : MS=ms59666891
Ttl   : 3600

Create the DNS record

I can’t help much here. However you manage to get the record in place, be it via your own configuration file, a web front end or a service ticket, in any case we need a TXT record in the subdomain “ad.wigand.de” with the content “MS=ms59666891” (or whatever it says in your case, of course). Once that is done, we can continue. It may be worth letting a few minutes pass here until all caches, secondary DNS servers etc. agree on the new record…

Complete the verification

Now we can go back to PowerShell and ask Azure to look for the desired TXT record:

Confirm-MsolDomain -DomainName ad.wigand.de

If an error message comes back asking you to try again later, then either we made the record incorrectly (it really must start with “MS=”!) or the DNS server delivered a stale answer and we really do have to wait a bit longer. Otherwise our domain is verified, which we can also check with the cmdlet Get-MsolDomain mentioned above (the Status column then shows “Verified”).

Configuring ADConnect

The prerequisites are in place, so we can start the installation of ADConnect. It is described wonderfully simply here. A few notes, although most of it can simply be entered intuitively, at least for the simple variant without password sync:

  • For “Connect to Azure AD” we naturally use the same user name we signed in with above using Connect-MsolService.
  • On “Azure AD sign-in” we should be able to see the fruits of our earlier work, because this is where we select the UPN suffix to be used for sign-in, and our verified domain should appear there as well.
  • Filtering by OU, setting the “anchor” attribute, etc.: these are all standard settings…

Done. After a little while we can find the on-premises AD users in AAD:

Get-MsolUser | where {$_.UserPrincipalName –like "*ad.wigand.de*"}

…and please don’t use “-contains” :-), use -like. By the way, all synchronized entries contain the original GUID in Base64 form in the “ImmutableID” attribute, which makes a handy filter criterion, for example…


MSDN Blogs: Make a custom role for users to Create an Azure Data Factory


Azure built-in RBAC roles are pretty new and do not cover all the bases yet, so I needed to create a custom role of our own to let users create data factories without having to be a co-administrator of the subscription.

The Problem

The built-in role Data Factory Contributor does not include the rights a given user needs to create a new data factory. Role members can manage existing data factories, but CREATE is not listed for the data factory itself. See also: Data Factory Contributor https://azure.microsoft.com/en-us/documentation/articles/role-based-access-built-in-roles/

I didn’t want to grant everyone co-admin access on the subscription just so they could create new data factories, so we opted for a custom role to achieve the goal. Reading the description of the role, it does not mention that the user can create data factories, and I found that to be true through testing.

When I try to use a non-admin account to create a Data Factory in the Azure portal at https://portal.azure.com I get this error:

“You don’t have the required permissions (Microsoft.DataFactory/register/action) to create resources under the selected subscription.”

The Workaround Solution

Instead of giving my users co-admin on the subscription, I prefer a lesser impact approach.

The Azure portal website does not yet let you create custom roles (probably coming one day soon), so instead we will use PowerShell to do so.

1. Install PowerShell for Azure

Install the Azure SDK for PowerShell and run PowerShell ISE from the Start menu. https://azure.microsoft.com/en-us/downloads/


Note: in Windows versions before Windows 10, you may need to use Run as Administrator on this icon to get the Azure scripts to work.

2. Code my custom role

Using the code below, I created a custom role named Data Factory Creator by copying the existing role and clearing out all the info. Inspiration came from these two references: https://msdn.microsoft.com/en-us/library/mt678996.aspx

https://azure.microsoft.com/en-gb/documentation/articles/role-based-access-control-manage-access-powershell/

The lines starting with pound signs (#) are comments for you to read and are not run by the PowerShell interpreter.
I recommend using PowerShell ISE because it lets you highlight a block of code in the script window in the top half and run just that selection (F8), which is nice for piecemeal troubleshooting. The output from running the code is shown in the output pane beneath the code.

You need to substitute in your actual subscription ID in the place of the zeros. 0000000-0000-0000-0000-00000000000

==============================

# Login prompt to authenticate your admin account to use Azure
Login-AzureRmAccount

# Copy the existing Data Factory Contributor role, clear its actions and scopes,
# and add the REGISTER action to the new role
$role = Get-AzureRmRoleDefinition "Data Factory Contributor"
$role.Id = $null
$role.Name = "Data Factory Creator"
$role.Description = "Can create data factories."
$role.Actions.Clear()
$role.Actions.Add("Microsoft.DataFactory/register/action")
$role.AssignableScopes.Clear()
# Type your own subscription ID here
$role.AssignableScopes.Add("/subscriptions/0000000-0000-0000-0000-00000000000")
New-AzureRmRoleDefinition -Role $role

# List custom roles to see if the Data Factory Creator role worked
Get-AzureRmRoleDefinition | FT Name, IsCustom

# List the details of the new role to make sure it matches expectations
Get-AzureRmRoleDefinition "Data Factory Creator"
(Get-AzureRmRoleDefinition "Data Factory Creator").Actions

# Delete the custom role if needed
# Get-AzureRmRoleDefinition "Data Factory Creator" | Remove-AzureRmRoleDefinition

==============================

3. Locate the subscription and add the users into the new custom role.

Add the required users to our new custom Data Factory Creator role at the subscription level.

Note: it is not sufficient to do this at the resource group level, since the action to create an object happens at the subscription level first.

Visit the Azure portal > Subscriptions > select the subscription > Settings (All settings) > Users > Roles > Data Factory Creator

Add the user > type their name to search the directories, then select the user in the list to add them.

4. Add users to the built-in Data Factory Contributor role as well

Add the required users to the built-in Data Factory Contributor role at the resource group level if you want them to manage the data factories in a given resource group. If you want them to manage data factories across the whole subscription, you can grant them role membership at the subscription level, but the resource group role is required as well to create new data factories.

These steps are at the resource group level. You could do the same as in step 3 for the Data Factory Contributor role.

Resource groups > Pick the scope of Resource Group you want > Access Control (IAM) > Roles

Roles > Data Factory Contributor > Add > type the users name to search the directories, and select the user from the list.

5. Have the users test

Make sure the users refresh their browser (F5) to get the latest security ACLs in the Azure portal, then try to create a Data Factory with the limited access granted by the two role memberships mentioned above.

MSDN Blogs: 10 Azure Security Technologies You Want to Know About NOW


You’ve got word that your group is tasked with moving to the cloud and one of the cloud providers you’ll be working with is Microsoft Azure.

Great!

Now you’re thinking “what about security?”

Good question.

Microsoft Azure has a ton of security built right into the platform. You can learn about Azure platform security (which you get with any service you run on Azure) by visiting the Microsoft Trust Center. There you’ll learn about what we do at the Azure platform level to secure the services and information you place in Azure.

If you want to dive into Azure security technical details, architecture, and best practices, then head on over to the Azure Security Information site.

The next step is to start learning about the services, features and technologies that we make available to you to enhance the security of the services you build on Azure. There are a lot of them to choose from, so to help you get started, we’ve compiled a list of what we consider to be the “top 10” to start with. Over time, you’ll discover more, but make sure you know about these first!

1. Azure Security Center

Azure Security Center provides you a central location from which you can assign security policies, get security recommendations, and receive alerts and remediations for IaaS and PaaS assets you place in Azure. With Azure Security Center you’ll:

  • Have a better understanding of your overall security state
  • Be able to define security policies for your subscriptions and resource groups
  • Easily deploy integrated security partner solutions
  • Take advantage of advanced threat detection and quickly respond to threats

Azure Security Center is our elite security offering for protecting your Azure assets.

For more information about Azure Security Center, check out What is Azure Security Center.

2. OMS Security & Compliance

Operations Management Suite (OMS) Security and Compliance complements and extends the advanced detection and alerting capabilities found in Azure Security Center by including your on-premises resources. Similar to Azure Security Center, OMS Security & Compliance helps you prevent, detect and respond to threats.

With OMS Security & Compliance you can:

  • Analyze and investigate incidents
  • Detect threats before they happen
  • Streamline security audits

For more information about OMS Security & Compliance, check out Operations Management Suite | Security & Compliance

3. Azure Key Vault

Azure Key Vault is your Hardware Security Module (HSM) in the cloud. You can use Azure Key Vault to store and encrypt keys and small secrets like passwords using keys stored in Azure Key Vault HSMs. You can also monitor and audit key and secret usage by taking advantage of Azure logging; just pipe your logs into Azure HDInsight or your on-premises (or cloud) Security Information and Event Management (SIEM) system and you can get even more information and insights about key use (and abuse).

For more information about Azure Key Vault, check out What is Azure Key Vault.

4. Azure Disk Encryption

Azure Disk Encryption lets you encrypt your Windows and Linux Azure Virtual Machine disks. Azure Disk Encryption uses the industry-standard BitLocker for Windows VMs and DM-Crypt for Linux VMs to provide volume encryption for the OS and the data disks. Azure Disk Encryption is integrated with Azure Key Vault to help you control and manage the disk encryption keys and secrets in your key vault subscription, while ensuring that all data on the virtual machine disks is encrypted at rest in your Azure storage. With Azure Disk Encryption, even if your virtual machine disks are stolen, the attacker will not be able to access the data on the encrypted disk.

For more information about Azure Disk Encryption, check out Azure Disk Encryption for Windows and Linux Azure Virtual Machines

5. Azure Storage Service Encryption

Azure Storage Service Encryption helps you protect data to meet organizational security and compliance commitments. With this feature, Azure Storage automatically encrypts your data prior to persisting it to storage and decrypts it prior to retrieval. The encryption, decryption, and key management are opaque to users, so they never need to do anything to make it happen. Azure Storage Service Encryption enables you to automatically encrypt block blobs, page blobs and append blobs.

For more information about Azure Storage Service Encryption, check out Azure Storage Service Encryption for Data at Rest (Preview).

6. Azure SQL Transparent Data Encryption

SQL Transparent Data Encryption (TDE) helps protect your data in the scenario where physical media (such as drives or backup tapes) is stolen and a malicious party could otherwise simply restore or attach the database and browse the data. One solution is to encrypt the sensitive data in the database and protect the keys that are used to encrypt the data with a certificate. This prevents anyone without the keys from using the data. TDE performs real-time I/O encryption and decryption of the data and log files for both SQL Server in Azure Virtual Machines and Azure SQL Database.

For more information about Azure SQL Transparent Data Encryption, check out Transparent Data Encryption.

7. Azure SQL Cell Level Encryption

Azure SQL Cell Level Encryption allows you to select the columns you want to encrypt in your database, which can be useful in some instances when you have very large databases.

For more information about Azure SQL Cell Level Encryption, check out Recommendations for using Cell Level Encryption in Azure SQL Database.

8. Azure Log Integration

Azure Log Integration enables you to integrate logs from both Azure and on-premises assets so that you can integrate them with your on-premises Security Information and Event Management System (SIEM).

For more information about Azure Log Integration, check out Microsoft Azure Log Integration – Preview

9. Azure Active Directory Multi-Factor Authentication

Azure Active Directory Multi-Factor Authentication (MFA) enables you to increase the security of your solutions hosted on Azure by avoiding the security issues inherent in traditional username/password sign-in. There are a number of identity verification options, such as a phone call, SMS message, or mobile app notification. The solution provides real-time alerts and monitoring and can be deployed on-premises, in the cloud, or both.

For more information about Azure Active Directory Multi-Factor Authentication, check out What is Azure Multi-Factor Authentication.

10. Azure Active Directory Privileged Identity Management

Azure Active Directory Privileged Identity Management (PIM) enables you to manage, control, and monitor access within your organization. This includes access to resources in Azure AD and other Microsoft online services like Office 365 or Microsoft Intune.

Azure AD Privileged Identity Management helps you:

  • See which users are Azure AD administrators
  • Enable on-demand, “just in time” administrative access to Microsoft Online Services like Office 365 and Intune
  • Get reports about administrator access history and changes in administrator assignments
  • Get alerts about access to a privileged role

For more information about Azure Active Directory Privileged Identity Management, check out Azure AD Privileged Identity Management.

Thanks!

Tom

Tom Shinder
Program Manager, Azure Security
@tshinder | Facebook | LinkedIn | Email | Web | Bing me! | GOOG me!


MS Access Blog: Office 365 news roundup


Microsoft customers of every kind, from individuals to enterprises to educators, rely on Office 365 for exceptional productivity, privacy and security. Office 365 provides the tools to help Office 365 administrators of Fortune 500 companies, small businesses and universities work better, more efficiently and more easily.

We recently released the Service Assurance Dashboard as part of the Office 365 Security and Compliance Center, which enables administrators to perform a risk assessment on Office 365 services—on demand—and provides transparency into Office 365 services. We also announced an update to the Yammer apps for iOS and Android, which allows IT administrators to support their mobile workforce and provide employees with access to corporate resources on their personal mobile devices while protecting their corporate data using mobile application management (MAM) controls in Microsoft Intune.

As part of our ongoing work to enhance Office 365 productivity, we announced the general availability of the Microsoft Excel REST API for Office 365. Developers can now use Excel to power custom apps that extend the value of your data, calculations, reporting and dashboards. In addition, we recently released an Office Online extension that lets you create and access Office files directly from the Microsoft Edge browser. We also launched the OneNote Importer tool for Mac, which makes it easy for Mac users to move their Evernote pages to OneNote, allowing them to save money while taking advantage of the OneNote integration with Office apps like Word, PowerPoint and Excel. And for educators, we recently made improvements to OneNote Class Notebooks and OneNote Staff Notebooks based on teacher feedback.

Office 365 is a complete solution, providing the productivity, security and manageability you need.

Below is a roundup of some key news items from the last couple of weeks. Enjoy!

Democratizing data—Atkins goes digital by default with Office 365 E5—Learn how Office 365 is helping employees at Atkins, a design, engineering and project management consultancy, make faster, better decisions.

Carvajal switches to Office 365 for faster business, reduced costs—Find out how this diverse company with offices in 15 countries uses Office 365 to provide companywide collaboration, productivity and efficiency.

Microsoft’s Continuous Improvements to Office 365 Reporting—Learn more about the administrative reporting tools Microsoft provides with Office 365.

A no-frills look at Microsoft Office 2016 and Office 365 versions—Discover which Office plan is right for you and your business.

Microsoft expands and renews international certifications in seven countries—Find out more about the new and expanded cloud certifications that Microsoft recently achieved to ensure greater privacy, security and compliance.


MSDN Blogs: Windows Subsystem for Linux Overview


Bash on Ubuntu on Windows, which I have covered several times on this blog as a personal research topic, is a Linux-compatible subsystem provided as one feature of the Windows Subsystem for Linux (hereafter WSL). It is “distro agnostic” (there is no good translation for this, but roughly it means it can be installed without risk to, or conflicts with, the rest of the system).

 

How WSL works

WSL is a new feature of the Windows kernel, and it is still a preview feature in the Anniversary Update. Perhaps because of its strong impact, Bash on Ubuntu on Windows tends to get all the attention, but it is more accurate to picture Bash on Ubuntu on Windows as one interface that can be used on top of WSL.

The WSL mechanism, as shown in the figure below, is that Linux-compatible system calls can be reached from endpoints such as Bash, GCC/G++, Ruby and so on running in user mode on Ubuntu. A large number of Linux system calls, starting with fork(), are supported. (“Supported” here means the capability is present; as a product feature it is still a preview. Sorry for the confusion.)

[Figure: WSL architecture diagram]

 

 

What works and what doesn't

First, please understand that this is still a beta feature, so refrain from using it in production environments. Many things work; the ones I have actually tried include Bash, apt-get, git, Ruby, node, GNU C/C++ and CoreCLR. apt-get can sometimes fail with very large binaries. Since it is impractical to test every scenario, support is limited to common functionality.

 

Requests

If you have feature requests, bug reports or other suggestions, please send feedback to the development team via the links below, or vote on existing suggestions.

 UserVoice (feature requests and improvements)

https://wpdev.uservoice.com/forums/266908-command-prompt-console-bash-on-ubuntu-on-windo

 GitHub (issues)

https://aka.ms/winbashgithub

 

Reference links

Windows Subsystem for Linux Overview

https://blogs.msdn.microsoft.com/wsl/2016/04/22/windows-subsystem-for-linux-overview/

 

Console & Command-line

https://blogs.msdn.Microsoft.com/commandline

 

The information in this post (including any attachments and linked content) is current as of the date of writing and may change without notice.

MSDN Blogs: How to Install PowerShell on Linux (Ubuntu 16.04)


The other day PowerShell was open sourced and became available on Linux and OS X. This is not simply a matter of it now being usable elsewhere; behind it is a clear "Microsoft loves Linux" vision and message (see the linked article for details). Together with the announcement of SQL Server on Linux (the product is still in private beta, so I cannot comment on its contents today, but the feature enhancements in SQL Server 2016 give you a good idea of what to expect), the multi-platform, open source .NET Core, and Bash coming to Windows 10 as part of WSL, Microsoft has been building out a platform for open source software development. Extensions to Microsoft Operations Management Suite (OMS) are also on the horizon. Personally, I expect that as PowerShell Desired State Configuration (DSC) covers an ever wider range of targets, infrastructure operations across platforms will become even more automated and efficient, further accelerating the build-and-discard infrastructure lifecycle typified by DevOps.

Since I happened to have an Ubuntu 16.04 (64-bit) environment at hand, I tried installing the newly open-sourced PowerShell and wrote up the steps below.

 

Installing PowerShell on Linux

Prerequisite environment:

Welcome to Ubuntu 16.04.1 LTS (GNU/Linux 4.4.0-34-generic x86_64)

* Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/advantage

Get cloud support with Ubuntu Advantage Cloud Guest:
 http://www.ubuntu.com/business/services/cloud

14 packages can be updated.
 0 updates are security updates.
 Last login: Sat Aug 20 01:43:00 2016 from 157.65.54.111

Create a working folder and change into it

miyamam@SQLinux:~$ pwd 
 /home/miyamam
 miyamam@SQLinux:~$ mkdir temp
 miyamam@SQLinux:~$ cd temp/

Download the installation binary with the wget command.

 miyamam@SQLinux:~/temp$ wget https://github.com/PowerShell/PowerShell/releases/download/v6.0.0-alpha.9/powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb
 --2016-08-20 01:44:41--  https://github.com/PowerShell/PowerShell/releases/download/v6.0.0-alpha.9/powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb
 Resolving github.com (github.com)... 192.30.253.112
 Connecting to github.com (github.com)|192.30.253.112|:443... connected.
 HTTP request sent, awaiting response... 302 Found
 Location: https://github-cloud.s3.amazonaws.com/releases/49609581/3ab34990-63bf-11e6-84b3-9c3f34c3318d.deb?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAISTNZFOVBIJMK3TQ%2F20160820%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20160820T014442Z&X-Amz-Expires=300&X-Amz-Signature=9eee30f12bd1767fe0200f43acac93214a70e7aaa7835e97abc4eb393487be6d&X-Amz-SignedHeaders=host&actor_id=0&response-content-disposition=attachment%3B%20filename%3Dpowershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb&response-content-type=application%2Foctet-stream [following]
 --2016-08-20 01:44:42--  https://github-cloud.s3.amazonaws.com/releases/49609581/3ab34990-63bf-11e6-84b3-9c3f34c3318d.deb?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAISTNZFOVBIJMK3TQ%2F20160820%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20160820T014442Z&X-Amz-Expires=300&X-Amz-Signature=9eee30f12bd1767fe0200f43acac93214a70e7aaa7835e97abc4eb393487be6d&X-Amz-SignedHeaders=host&actor_id=0&response-content-disposition=attachment%3B%20filename%3Dpowershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb&response-content-type=application%2Foctet-stream
 Resolving github-cloud.s3.amazonaws.com (github-cloud.s3.amazonaws.com)... 54.231.72.219
 Connecting to github-cloud.s3.amazonaws.com (github-cloud.s3.amazonaws.com)|54.231.72.219|:443... connected.
 HTTP request sent, awaiting response... 200 OK
 Length: 40928824 (39M) [application/octet-stream]
 Saving to: ‘powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb’

powershell_6.0.0-alpha.9-1ubu 100%[=================================================>]  39.03M  12.1MB/s    in 4.3s

2016-08-20 01:44:48 (9.03 MB/s) - ‘powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb’ saved [40928824/40928824]


# The download link may have changed as newer releases come out, so check GitHub and pick the one that matches your environment.

[Screenshot: PowerShell release binaries on the GitHub releases page]

 

Installation

miyamam@SQLinux:~/temp$ sudo apt-get install libunwind8 libicu55
Reading package lists... Done
Building dependency tree
Reading state information... Done
libicu55 is already the newest version (55.1-7).
libunwind8 is already the newest version (1.1-4.1).
0 upgraded, 0 newly installed, 0 to remove and 18 not upgraded.
miyamam@SQLinux:~/temp$ sudo dpkg -i powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb
Selecting previously unselected package powershell.
(Reading database ... 99535 files and directories currently installed.)
Preparing to unpack powershell_6.0.0-alpha.9-1ubuntu1.16.04.1_amd64.deb ...
Unpacking powershell (6.0.0-alpha.9-1) ...
Setting up powershell (6.0.0-alpha.9-1) ...
Processing triggers for man-db (2.7.5-1) ...

 

Confirming startup and Hello PowerShell

miyamam@SQLinux:~/temp$ powershell
 PowerShell
 Copyright (C) 2016 Microsoft Corporation. All rights reserved.

PS /home/miyamam/temp> $PSVersionTable
 Name                           Value
 ----                           -----
 PSVersion                      6.0.0-alpha
 PSEdition                      Core
 PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}
 BuildVersion                   3.0.0.0
 GitCommitId                    v6.0.0-alpha.9
 CLRVersion
 WSManStackVersion              3.0
 PSRemotingProtocolVersion      2.3
 SerializationVersion           1.1.0.1

 

Let's use the Write-Host cmdlet, a classic PowerShell sample. (The -ForegroundColor option displays the output in the specified color.)

 

PS /home/miyamam/temp> Write-Host "Hello PowerShell!!" -ForegroundColor Cyan
Hello PowerShell!!
PS /home/miyamam/temp> Write-Host "Hello PowerShell!!" -ForegroundColor Magenta
Hello PowerShell!!

[Screenshot: Write-Host output in cyan and magenta]

 

As a check of the interactive shell, let's try ifconfig, which would fail in a PowerShell environment on Windows…

[Screenshot: running ifconfig from the PowerShell prompt]

It works just fine.

[Screenshot: ifconfig output]
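As an extra check that is not part of the original transcript, PowerShell's object pipeline also works on Linux. A minimal sketch you can try yourself (the property used for sorting is just an example):

# Show the five processes using the most memory, confirming the object pipeline behaves as it does on Windows.
Get-Process | Sort-Object WorkingSet -Descending | Select-Object -First 5 Name, Id, WorkingSet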

To exit, use the exit command.

 

Reference links

PowerShell is open sourced and is available on Linux

https://azure.microsoft.com/en-us/blog/powershell-is-open-sourced-and-is-available-on-linux/

 

Package installation instructions

https://github.com/PowerShell/PowerShell/blob/master/docs/installation/linux.md

 

 

The information in this post (including any attachments and linked content) is current as of the date of writing and may change without notice.

MSDN Blogs: NEXT UP — Get a leg up on your future with a Microsoft Office 365 Certification


Earning any kind of specialist certification is a great way to stand out from the crowd, whether you’re looking for a new challenge, a new job, or a way to make yourself more valuable to your current employer. With the growing importance of the cloud, Microsoft Office 365 is a must-have certification for anyone looking to prove their skills.

You’re probably familiar with all the great things Microsoft Office 365 brings to a business: a software suite that matches a business’s needs anytime, on any device. You’ve worked with those solutions and have earned technical skills and knowledge.

Now it’s time to validate that experience.

Here’s what it takes to become a Certified Microsoft Office 365 Specialist: schedule, prepare for and pass one of these two exams:

Exam 70-346: Managing Office 365 Identities and Requirements is for you if you take part in evaluating, planning, deploying, and operating Office 365 services, including their dependencies, requirements, and supporting technologies.

To get certified by passing this exam, you’ll need to already be familiar with the features and capabilities of Office 365, as well as have strong skills in managing cloud identities with Microsoft PowerShell.

This exam tests your knowledge of how to:

  • Provision Office 365
  • Plan and implement networking and security in Office 365
  • Manage Cloud Identities
  • Implement and manage identities by using Azure AD Connect
  • Implement and manage federated identities for single sign on
  • Monitor and troubleshoot Office 365 availability and usage

Exam 70-347: Enabling Office 365 Services is for people who take part in evaluating, planning, deploying, and operating Office 365 services, including their dependencies, requirements, and supporting technologies.

To get certified by passing this exam, you should have experience with the Office 365 Admin Center and an understanding of Microsoft Exchange Online, Skype for Business Online, SharePoint Online, Office 365 ProPlus, and Microsoft Azure Active Directory.

In order to pass, you’ll need to know how to:

  • Manage clients and end-user devices
  • Provision SharePoint Online site collections
  • Configure Exchange Online and Skype for Business Online for end users
  • Plan for Exchange Online and Skype for Business Online

Next Up is our five-week, three-stage exam preparation camp created by Microsoft Certified Trainers to give you the edge you need and fast-track your way to becoming a Microsoft Office 365 Certified Specialist.

STAGE 1: Five weeks of guided self-study with the support of skilled Microsoft Certified Trainers via Yammer.

STAGE 2: A half-day, in-person exam preparation session designed to get you in shape to sit the exam, with access to skilled experts to help you answer those tough questions.

STAGE 3: After the half-day of in-person preparation, it is time for all your hard work to pay off: SIT THE EXAM and become a Microsoft Office 365 Certified Specialist.

The program kicks off in the final week of September. Register today and get ready for your five weeks of virtual study prior to your in-person class.

Course schedule (venues: Sydney – Microsoft Office; Melbourne, Brisbane, Adelaide and Perth – Cliftons):

Exam 70-346: Managing Office 365 Identities and Requirements
In-person exam prep 1:00pm – 5:00pm; sit the exam 5:00pm – 7:00pm

  • Sydney: Tuesday, November 1
  • Melbourne: Monday, November 21
  • Brisbane: Thursday, November 24
  • Adelaide: Monday, November 28
  • Perth: Thursday, December 1

Exam 70-347: Enabling Office 365 Services
In-person exam prep 8:30am – 12:30pm; sit the exam 1:00pm – 3:00pm

  • Sydney: Wednesday, November 2
  • Melbourne: Tuesday, November 22
  • Brisbane: Friday, November 25
  • Adelaide: Tuesday, November 29
  • Perth: Friday, December 2

MSDN Blogs: Console Output: My New Debugging and Testing Tool for Windows


I have been working with a partner who wanted to get standard console output from a UWP app for debugging and testing purposes. Of course, the easiest way to get debugging output is by attaching a debugger like Visual Studio, but in some cases that isn’t possible or practical. I just published an app, Console Output, that enables other apps to send output text lines to it. There are some new capabilities in the Windows 10 Anniversary Update that make this possible:

  1. App Services that can run in the application’s process instead of in a background task 
  2. Remote Systems API that enables connecting apps across devices and platforms.

Desktop

On its own, Console Output doesn’t do anything special when you run it, but if you launch it from another app by activating the consoleoutput: protocol handler (for example, consoleoutput:?title=My app name), you can then send text output to it with app services.

When you install and run Console Output, you will see a full code sample in C# showing how to launch Console Output with the Launcher.LaunchUriAsync() API and then open an AppServiceConnection channel for bi-directional communication with Console Output. The sample code is at the bottom of this article. Console Output is also enabled for remote activation, so debugging and testing devices nearby or through the cloud becomes possible as well. I have also shared the source code for a Console Output Tester app that demonstrates how an app might use Console Output.

Sample C# Code

using System;
using Windows.ApplicationModel.AppService;
using Windows.Foundation.Collections;
using Windows.System;

public class ConsoleLoggingTester : IDisposable
{
    private AppServiceConnection _appServiceConnection;
   
    public ConsoleLoggingTester()
    {
        InitializeAppServiceConnection();
    }

    private async void InitializeAppServiceConnection()
    {
        // this is the unique package family name of the Console Output app
        const string consoleOutputPackageFamilyName = "49752MichaelS.Scherotter.ConsoleOutput_9eg5g21zq32qm";

        var options = new LauncherOptions
        {
            PreferredApplicationDisplayName = "Console Output",
            PreferredApplicationPackageFamilyName = consoleOutputPackageFamilyName,
            TargetApplicationPackageFamilyName = consoleOutputPackageFamilyName,
        };

        // launch the ConsoleOutput app
        var uri = new Uri("consoleoutput:?Title=Console Output Tester&input=true");

        if (!await Launcher.LaunchUriAsync(uri, options))
        {
            return;
        }

        var appServiceConnection = new AppServiceConnection
        {
            AppServiceName = "consoleoutput",
            PackageFamilyName = consoleOutputPackageFamilyName
        };

        var status = await appServiceConnection.OpenAsync();

        if (status == AppServiceConnectionStatus.Success)
        {
            _appServiceConnection = appServiceConnection;
   
            // because we want to get messages back from the console, we
            // launched the app with the input=true parameter
            _appServiceConnection.RequestReceived += OnRequestReceived;
        }
    }

    public async void LogMessage(string messageText)
    {
        if (_appServiceConnection == null)
        {
            return;
        }

        var message = new ValueSet
        {
            ["Message"] = messageText
        };

        await _appServiceConnection.SendMessageAsync(message);
    }

    public void Dispose()
    {
        if (_appServiceConnection != null)
        {
            _appServiceConnection.Dispose();
            _appServiceConnection = null;
        }
    }

    private void OnRequestReceived(
        AppServiceConnection sender,
        AppServiceRequestReceivedEventArgs args)
    {
        var message = args.Request.Message["Message"] as string;

        // handle message input from the Console Output app
    }
}

Links

Please tell me if you start using Console Output and if you find it useful!


MSDN Blogs: Node “Console App” & Debugging TypeScript in VS Code


One of the most common utility cases when working with SharePoint is to create a new console app, import the PnP CSOM components, and then perform the actions required. This is a way to test code, make quick one-off updates, or process many sites for a common operation. As we have developed the Patterns and Practices client library I’ve been looking forward to an opportunity to do something similar directly in node. Once we added node request support in 1.0.3 this became much closer to reality. The next step was setting up a project to run an arbitrary TypeScript program and enable debugging, a process outlined in this post.

Step 1 – Setup the Project

As simple as it sounds, the first step is setting up our TypeScript project. You can always add any libraries you like, but this will get us started for the example. We then install the typings we want, to give us better type checking and IntelliSense. Once these setup steps are done, you can reuse the application by just updating the source code.

npm init
npm install gulp gulp-sourcemaps gulp-typescript node-fetch sp-pnp-js typescript typings --save-dev
typings install dt~es6-promise dt~microsoft.ajax dt~promises-a-plus dt~sharepoint dt~whatwg-fetch --global --save

Next create a tsconfig.json file in the root of the project and add the below content and save.

{"compilerOptions": {"target": "es5","module": "commonjs","jsx": "react","declaration": false,"sourceMap": true,"removeComments": true,"sortOutput": true
 }
}

Finally we need a gulp file to build and run our program. Create a file named gulpfile.js in the root directory and add the below content.

var gulp = require("gulp"),
 tsc = require("gulp-typescript"),
 maps = require('gulp-sourcemaps'),
 exec = require("child_process").exec;

gulp.task("build", function () {

 var src = ["src/**/*.ts", "typings/index.d.ts"];

 var project = tsc.createProject("tsconfig.json");
 return gulp.src(src)
   .pipe(maps.init())
   .pipe(tsc(project)).js
   .pipe(maps.write(".", { sourceRoot: "../src" }))
   .pipe(gulp.dest("build"));
});

gulp.task("run", ["build"], function (cb) {

 exec('node build/main.js', function (err, stdout, stderr) {
   console.log(stdout);
   console.log(stderr);
   cb(err);
 });
});

We first require the four libraries we need: gulp, the task runner; gulp-typescript, which pipes the TypeScript compiler into gulp; gulp-sourcemaps, which captures and writes the source maps we generate during build; and child_process.exec, which is how we start a new process in node. Then we define two gulp tasks, "build" and "run". The build task transpiles our TypeScript files into JavaScript and writes out the source maps – these maps will be critical once we begin debugging. Also note that we include the typings index file so our global references are defined during build; missing references are a common source of unexpected build errors. Finally we build all the files in the "src" folder and write the results to the "build" folder – these are just names and you can change them to suit your preference.

The run task is what actually takes our built output and runs it. It does so by invoking exec on the main.js file found in the build folder (we’ll create a main.ts source file shortly). The exec operation starts a new node process for the file at the path supplied as the first argument. Once our program is ready, we will be able to type gulp run and execute our code.

Step 2 – Write some Code!

Now that we have set up our project, we can actually do something. Start by creating a "src" folder in your project and adding a main.ts file to that folder. Then add the below contents to main.ts:

console.log("Hello world!");

Now in your console type the command “gulp run” and you should see your project get built and “Hello World!” written out to the console. Awesome! Now we can get down to business. Update your main.ts to include the below:

import pnp from "sp-pnp-js";

pnp.setup({
  nodeClientOptions: {
    clientId: "4d4d3329-1d3d-425c-93fa-e4ae393f8cbb",
    clientSecret: "F9iUC6B4LM7TClLWY5aixJEmGDGpvGsXD3lifX7ogts=",
    siteUrl: "https://{your tenant}.sharepoint.com/sites/dev"
  }
});

pnp.sp.web.select("Title").get().then((r) => console.log(JSON.stringify(r)));

You will need to update the clientId, clientSecret and siteUrl values once you register the add-in permissions in your site. If you have properly registered the permissions you can once again use the "gulp run" command, and you should see the title of your web written to the console. You can then begin using your app to perform any tasks you want in your site. Remember to ensure you’ve requested the appropriate permissions on /_layouts/appinv.aspx. One little note: batching is currently broken in node – this has been fixed and will be included in the 1.0.4 release coming soon.

Step 3 – Debugging

Once we can run code it would be fantastic if we could debug it to see what is going on. In Visual Studio Code this means adding a launch.json file. First create a folder named “.vscode” in the root of the project and add a file named “launch.json” with the contents:

{"version": "0.2.0","configurations": [
   {"name": "Launch","type": "node","request": "launch","program": "${workspaceRoot}/src/main.ts","stopOnEntry": false,"args": [],"cwd": "${workspaceRoot}","preLaunchTask": "build","runtimeExecutable": null,"runtimeArgs": ["--nolazy"
     ],"env": {"NODE_ENV": "development"
     },"externalConsole": false,"sourceMaps": true,"outDir": "${workspaceRoot}/build"
   }
 ]
}

This file will instruct Visual Studio Code how to launch when you hit F5. In your main.ts file set a break point and hit F5. You should see the breakpoint hit and you can examine locals and add watches on the debug tab accessed on the left hand navigation bar. If your breakpoint is not hit, try restarting Visual Studio Code.

Gotcha: If you continue to have issues hitting your break point or are getting the message “Breakpoint ignored because generated code not found (source map problem?)” – ensure you have correctly set the “sourceRoot” property of the source maps write call found in the gulpfile.js’s build task. This needs to be a relative path from your built output to your source files.

Now that you can debug your code and use the Patterns and Practices Core Library from a console app you can rapidly develop and test your code – or perform quick queries and updates to your SharePoint sites. Happy coding!

Download the Sample Project

You can download the starter project as well: NodeConsoleApp. You will need to run the following commands to load the dependencies:

npm install
typings install

What is the JS Core Component?

The Patterns and Practices JavaScript Core Library was created to help developers by simplifying common operations within SharePoint. This is aligned with helping folks transition into client-side development in support of the upcoming SharePoint Framework. Currently it contains a fluent API for working with the full SharePoint REST API as well as utility and helper functions. This takes the guesswork out of creating REST requests, letting developers focus more on the what and less on the how.

“Sharing is Caring”

MSDN Blogs: MPP, DWU and PolyBase in Azure SQL Data warehouse


Understanding a few basic concepts in Azure SQL Data Warehouse makes it much easier to get a good grip on the functionality it offers. Some key terms are discussed below.

MPP: Unlike the previous incarnation of the on-premises SQL Server data warehouse, which uses SMP (Symmetric Multi-Processing), Azure SQL Data Warehouse is designed around MPP (Massively Parallel Processing). In essence, this design coordinates the processing of a single task across multiple logical processors, each with its own CPU and memory, which communicate with one another.

These processors are also referred to as nodes, and they fall into the following two types.

Control node: Creates the parallel query plan and coordinates query execution, data aggregation, and so on.
Compute node: The nodes that do the actual computation.

MPP has two design models:

  1. Shared Nothing
  2. Shared Disk

In the Shared Nothing model, used by Azure SQL Data Warehouse, each node is independent and owns its own data, which is a subset of the rows of a table in the data warehouse. This enables massive scalability. Speaking of scalability, Azure SQL Data Warehouse expresses it in terms of Data Warehouse Units (DWU).

DWU: In essence, a DWU is a function of memory, CPU and concurrency. The basic level, DW100, can have up to 24 GB of RAM with lower concurrency. One DWU is roughly 7.5 DTU (Database Throughput Unit, used to express the horsepower of an OLTP Azure SQL Database) in capacity, although the two are not exactly comparable.
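To make the DWU setting concrete, here is a minimal PowerShell sketch (not from the original post) for scaling a SQL Data Warehouse between DWU levels with the Azure PowerShell (AzureRM) module. The resource group, server and database names are placeholders.

# All names below are placeholders for illustration.
Login-AzureRmAccount

# Scale the data warehouse to DW400.
Set-AzureRmSqlDatabase -ResourceGroupName "MyResourceGroup" `
    -ServerName "mydwserver" `
    -DatabaseName "mydatawarehouse" `
    -RequestedServiceObjectiveName "DW400"

# Confirm the current DWU level.
(Get-AzureRmSqlDatabase -ResourceGroupName "MyResourceGroup" `
    -ServerName "mydwserver" `
    -DatabaseName "mydatawarehouse").CurrentServiceObjectiveName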

 

Another important concept is the technology called PolyBase.

PolyBase: It provides a scalable, T-SQL-compatible query processing framework for combining data from an RDBMS with data in Hadoop (or Azure Blob Storage). It abstracts away much of the MapReduce/Hadoop technology for SQL developers who are comfortable in the realm of T-SQL.

As its name implies, it lets you query from, or store data in, multiple (poly) places (base). It is the engine that actually parallelizes the query. While it performs a great deal of optimization internally, developers can use DMVs (Dynamic Management Views) to monitor execution and inspect the query plan.
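As a small illustration (not from the original post), the sketch below runs a PolyBase-style query from PowerShell with Invoke-Sqlcmd (from the SQL Server PowerShell module). The server, credentials and table names are placeholders, and it assumes an external table over files in Azure Blob Storage has already been defined.

# All names and credentials below are placeholders for illustration.
$query = @"
-- Join a regular data warehouse table with an external (PolyBase) table
-- defined over files in Azure Blob Storage.
SELECT TOP 10 s.ProductKey, SUM(s.SalesAmount) AS TotalSales
FROM dbo.FactSales AS s
JOIN ext.WebClicks AS c ON c.ProductKey = s.ProductKey
GROUP BY s.ProductKey
ORDER BY TotalSales DESC;
"@

Invoke-Sqlcmd -ServerInstance "mydwserver.database.windows.net" `
    -Database "mydatawarehouse" `
    -Username "sqladmin" -Password "<password>" `
    -Query $query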

MSDN Blogs: test



Test blog

This is a test

An outline:

  • Just testing that I can use github markup

MSDN Blogs: Bruno test



Test blog

This is a test

An outline:

  • Just testing that I can use github markup
  • Just testing that I can use github markup

MSDN Blogs: test draft
