
MSDN Blogs: Early technical preview of Microsoft Drivers 4.1.0 for PHP on Windows for SQL Server Released!


Dear PHP Community,

Hi all, we are delighted to share the early technical preview of the Microsoft Drivers 4.1.0 for PHP for SQL Server. The new driver enables access to SQL Server, Azure SQL Database, and Azure SQL Data Warehouse from any PHP 7 application.

The major highlights of this release are bug fixes, new functionality, and better test coverage.

Here is a summary of the bug fixes and improvements:

  • The SQLSRV_ATTR_FETCHES_NUMERIC_TYPE connection attribute flag was added to the PDO_SQLSRV driver to handle numeric fetches from columns with numeric SQL types (bit, integer, smallint, tinyint, float, and real only). The flag can be turned on by setting it to true with PDO::setAttribute, for example: $conn->setAttribute(PDO::SQLSRV_ATTR_FETCHES_NUMERIC_TYPE, true); If SQLSRV_ATTR_FETCHES_NUMERIC_TYPE is set to true, the results from an integer column will be represented as an int; likewise, the SQL types float and real will be represented as float. Note the exceptions:
    • When the connection option flag ATTR_STRINGIFY_FETCHES is on, the return value will still be a string even when SQLSRV_ATTR_FETCHES_NUMERIC_TYPE is on.
    • When the returned PDO type in bind column is PDO_PARAM_INT, the return value from an integer column will be an int even if SQLSRV_ATTR_FETCHES_NUMERIC_TYPE is off.
  • Fixed float truncation when using buffered queries.
  • Fixed handling of Unicode strings and binary data in PDOStatement::bindParam when emulated prepares are on. To bind a Unicode string, PDO::SQLSRV_ENCODING_UTF8 should be set using $driverOptions; to bind a string to a column of SQL type binary, PDO::SQLSRV_ENCODING_BINARY should be set.
  • Fixed string truncation in bound output parameters when the size is not set and the length of the initialized variable is less than the output.
  • Fixed binding string parameters as bidirectional parameters (PDO::PARAM_INPUT_OUTPUT) in the PDO_SQLSRV driver. Note that for output or bidirectional parameters, PDOStatement::closeCursor should be called to get the output value.

Future Plans

Going forward, we plan to expand SQL Server 2016 feature support (for example, Always Encrypted), add build verification/fundamental tests, and fix bugs.

Getting the Preview Refresh

You can find the latest bits on our GitHub repository, at our existing address. We provide limited support while in preview on our GitHub Issues page. As always, we welcome contributions of any kind, whether pull requests or feature enhancements. On behalf of the team, I’d like to thank everyone for supporting us in our endeavors to provide you with a high-quality driver. Happy downloading!

Meet Bhagdev (meetb@microsoft.com)

MSFTlovesPHP


MSDN Blogs: Early technical preview of Microsoft Drivers 4.0.3 for PHP on Linux for SQL Server Released!


Hi all, we are delighted to share the Preview (4.0.3) of the SQLSRV and PDO_SQLSRV driver (64bit and 32bit) for Linux. Updated Linux drivers are available for Ubuntu 15.04, Ubuntu 16.04, and RedHat 7.2.

Here is a summary of the bug fixes and improvements:

    • The PDO_SQLSRV driver no longer requires PDO to be built as a shared extension.
    • Fixed an issue with format specifiers in error messages.
    • Fixed a segmentation fault when using buffered cursors.
    • Fixed an issue whereby calling sqlsrv_rows_affected on an empty result set would return a null result instead of 0.
    • Fixed an issue with error messages when there is an error in sizes in SQLSRV_SQLTYPE_*.

    Future Plans

Going forward, we plan to improve the quality of the Linux drivers, expand SQL Server 2016 feature support for both Windows and Linux (for example, Always Encrypted), add build verification/fundamental tests, and fix bugs.

    Getting the Preview Refresh

You can find the latest bits on our GitHub repository, at our existing address. We provide limited support while in preview on our GitHub Issues page. As always, we welcome contributions of any kind, whether pull requests or feature enhancements. On behalf of the team, I’d like to thank everyone for supporting us in our endeavors to provide you with a high-quality driver. Happy downloading!

    Meet Bhagdev (meetb@microsoft.com)

    MSFTlovesPHP

MSDN Blogs: Blynclight for Developers – Daddy is Working Light


    This post will show you how to code a custom Visual Studio VSIX add-in that will control a USB light, turning the light the color of your choice when Visual Studio is in the foreground or actively debugging. 

    The code is available for download at https://github.com/kaevans/BlyncLightAddin

    Background

    A few years ago, Andrew Connell and Scott Hanselman posted about getting a USB presence indicator for Lync (Lync + BusyLight = Great Solution for the Home Office, Is Daddy on a call? A BusyLight Presence indicator for Lync for my Home Office).  I received a Blynclight as a speaker gift for speaking at a conference a few years ago and it’s just sat on my desk, turning various colors as my Skype for Business presence changes. 

    Blynclight Standard

I am on the phone in meetings, so this does come in handy.  I started thinking that this could also be really useful to tell someone that I am currently coding and don’t wish to be disturbed.  If I am debugging, do not disturb!  Hence the BlyncLightAddIn, where you can control what color is shown on the Blynclight when Visual Studio starts and what color is shown when debugging.

    The code is available for download at https://github.com/kaevans/BlyncLightAddin

    Download the SDK

    I downloaded the Blynclight SDK from the Blynclight Developer Forum (http://blynclight.proboards.com/thread/2/blync-sdk-create-own-applications).  It required registration in order to download the SDK, but once you download it you can see the binaries, a sample application, and a Word doc that contains information on the API. 


Next, I created a Visual Studio 2015 Extensibility project.  I didn’t have the extensibility tools installed, so I was prompted to install the Visual Studio SDK.  It took a while, but finally I had a few new project template options in Visual Studio.

    Create a Custom Tool Window

    Once you have the extensibility tools installed (Visual Studio prompts you to install these if you haven’t already), create a new VSIX project in Visual Studio.


    Right-click the project and choose Add / New Item, then add a new Custom Tool Window.


    Visual Studio generates a bunch of files, including a Command class that provides a menu command item to open the custom tool window and a WPF control to define your custom tool window.  It also generates a file, *Package.cs, that loads all the stuff for your add-in.  This is where you set the default window position for your custom tool window.

    Set Default Window Pos
    1. [PackageRegistration(UseManagedResourcesOnly = true)]
    2. [InstalledProductRegistration("#110", "#112", "1.0", IconResourceID = 400)] // Info on this package for Help/About
    3. [ProvideMenuResource("Menus.ctmenu", 1)]
    4. [ProvideToolWindow(typeof(BlyncLight), Style = Microsoft.VisualStudio.Shell.VsDockStyle.Tabbed, Window = "3ae79031-e1bc-11d0-8f78-00a0c9110057")]
    5. [Guid(BlyncLightPackage.PackageGuidString)]
    6. [SuppressMessage("StyleCop.CSharp.DocumentationRules", "SA1650:ElementDocumentationMustBeSpelledCorrectly", Justification = "pkgdef, VS and vsixmanifest are valid VS terms")]
7. public sealed class BlyncLightPackage : Package
    8. {

Notice line 4, where the custom tool window is registered: we provide the style as tabbed, meaning it will be a tab in an existing window, and the window is docked to the Solution Explorer window (the GUID value is the GUID of the Solution Explorer).  To see where I stole this code from, see https://msdn.microsoft.com/en-us/library/cc138567.aspx

If you hit F5 right now, you have a functional yet boring add-in.  Go to the View / Other Windows menu item to load your custom tool window.


    Your custom tool window is opened, docked to the Solution Explorer window.


    Add Blynclight Functionality

    Now that the shell of our add-in is working, we can create our custom tool window.  You previously downloaded the SDK which includes Blynclight.dll.  This file is not signed, so you won’t be able to load it.  However, there is a workaround as shown in the post Sign a .NET Assembly with a Strong Name Without Recompiling.  You basically create a strong-named key using sn.exe, disassemble the DLL to intermediate language (IL) using ILDASM.exe, then assemble the IL into a DLL including your strong-named key using ILASM.exe.
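In case it helps, the round trip looks roughly like this (a sketch; the key file name is a placeholder, and the exact paths to sn.exe, ildasm.exe, and ilasm.exe depend on your SDK installation):

sn -k BlyncKey.snk
ildasm Blynclight.dll /out:Blynclight.il
ilasm Blynclight.il /dll /key=BlyncKey.snk /output=Blynclight.dll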


Following these steps, I then added a reference in Visual Studio to the signed assembly.


    We can now program against the Blynclight!

    Is Visual Studio the Active Window?

If I have Visual Studio open, I usually don’t want to be disturbed.  We can cheat this by registering for a callback any time the active window is changed.  Once it changes, we can look at the title of the window and decide whether to change the light’s color. Registering the callback is very simple, especially when someone on Stack Overflow has done the heavy lifting for us!  See the post Detect active window changed using C# without polling.  I’ll be the first to admit, this is an amateurish approach to just look for a hard-coded string “Microsoft Visual Studio”… but that’s the beauty of open source, right?  Just fork the repo and implement the rest that I was too lazy to finish.

    ActiveWindowWatcher
using System;
using System.Runtime.InteropServices;
using System.Text;

namespace BlyncLightAddin
{
    public class ActiveWindowWatcher : IBlyncWatcher
    {
        public event EventHandler StatusChanged;
        private bool isPaused;
        private const uint WINEVENT_OUTOFCONTEXT = 0;
        private const uint EVENT_SYSTEM_FOREGROUND = 3;
        IntPtr m_hhook;
        WinEventDelegate dele = null;

        delegate void WinEventDelegate(IntPtr hWinEventHook, uint eventType, IntPtr hwnd, int idObject, int idChild, uint dwEventThread, uint dwmsEventTime);

        [DllImport("user32.dll")]
        static extern IntPtr SetWinEventHook(uint eventMin, uint eventMax, IntPtr hmodWinEventProc, WinEventDelegate lpfnWinEventProc, uint idProcess, uint idThread, uint dwFlags);

        [DllImport("user32.dll")]
        static extern bool UnhookWinEvent(IntPtr hWinEventHook);

        [DllImport("user32.dll")]
        static extern IntPtr GetForegroundWindow();

        [DllImport("user32.dll")]
        static extern int GetWindowText(IntPtr hWnd, StringBuilder text, int count);

        private string GetActiveWindowTitle()
        {
            const int nChars = 256;
            IntPtr handle = IntPtr.Zero;
            StringBuilder Buff = new StringBuilder(nChars);
            handle = GetForegroundWindow();

            if (GetWindowText(handle, Buff, nChars) > 0)
            {
                return Buff.ToString();
            }
            return null;
        }

        public void WinEventProc(IntPtr hWinEventHook, uint eventType, IntPtr hwnd, int idObject, int idChild, uint dwEventThread, uint dwmsEventTime)
        {
            var text = GetActiveWindowTitle();
            if (string.IsNullOrEmpty(text))
            {
                OnStatusChanged(new StatusChangedEventArgs(Status.Default));
            }
            else
            {
                if (text.Contains("Microsoft Visual Studio"))
                {
                    // Raise event
                    OnStatusChanged(new StatusChangedEventArgs(Status.Busy));
                }
                else
                {
                    OnStatusChanged(new StatusChangedEventArgs(Status.Available));
                }
            }
        }

        public void Initialize()
        {
            // Keep the delegate in a field so it is not garbage collected
            // while the hook is installed.
            dele = new WinEventDelegate(WinEventProc);
            m_hhook = SetWinEventHook(EVENT_SYSTEM_FOREGROUND, EVENT_SYSTEM_FOREGROUND, IntPtr.Zero, dele, 0, 0, WINEVENT_OUTOFCONTEXT);
        }

        public void Close()
        {
            if (m_hhook.ToInt32() != 0)
            {
                UnhookWinEvent(m_hhook);
            }
        }

        protected virtual void OnStatusChanged(StatusChangedEventArgs e)
        {
            if (StatusChanged != null && isPaused == false)
            {
                StatusChanged(this, e);
            }
        }

        public void Pause()
        {
            isPaused = true;
        }

        public void Resume()
        {
            isPaused = false;
        }
    }
}


    Is Visual Studio Debugging?

    This one was a little harder to solve, but turned out so much easier to implement.  I wanted to know if Visual Studio was actively debugging something.  If it is, then changing windows doesn’t matter (debugging usually means that new windows are popping up all over the place), just show the status as busy. 

    DebuggerWatcher
using Microsoft.VisualStudio.Shell.Interop;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace BlyncLightAddin
{
    public class DebuggerWatcher : IBlyncWatcher, IVsDebuggerEvents
    {
        public event EventHandler StatusChanged;
        private uint cookie = default(uint);
        private bool isPaused;

        public void Initialize()
        {
            IVsDebugger debugService = Microsoft.VisualStudio.Shell.Package.GetGlobalService(typeof(SVsShellDebugger)) as IVsDebugger;
            if (debugService != null)
            {
                // Register for debug events; this class implements IVsDebuggerEvents.
                debugService.AdviseDebuggerEvents(this, out cookie);
            }
        }

        public int OnModeChange(DBGMODE dbgmodeNew)
        {
            switch (dbgmodeNew)
            {
                case DBGMODE.DBGMODE_Break:
                case DBGMODE.DBGMODE_Run:
                case DBGMODE.DBGMODE_Enc:
                case DBGMODE.DBGMODE_EncMask:
                    // Debugging
                    OnStatusChanged(new StatusChangedEventArgs(Status.Busy));
                    break;
                case DBGMODE.DBGMODE_Design:
                    // Debugger detached
                    OnStatusChanged(new StatusChangedEventArgs(Status.Available));
                    break;
                default:
                    OnStatusChanged(new StatusChangedEventArgs(Status.Default));
                    break;
            }
            return (int)dbgmodeNew;
        }

        protected virtual void OnStatusChanged(StatusChangedEventArgs e)
        {
            if (StatusChanged != null && isPaused == false)
            {
                StatusChanged(this, e);
            }
        }

        public void Close()
        {
            IVsDebugger debugService = Microsoft.VisualStudio.Shell.Package.GetGlobalService(typeof(SVsShellDebugger)) as IVsDebugger;
            if (debugService != null)
            {
                // Unregister for debug events.
                debugService.UnadviseDebuggerEvents(cookie);
            }
        }

        public void Pause()
        {
            isPaused = true;
        }

        public void Resume()
        {
            isPaused = false;
        }
    }
}


    Using the IBlyncWatcher Interface

    Using our two implementations above is really simple.  I defined the interface, IBlyncWatcher, that both implement.

    IBlyncWatcher
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace BlyncLightAddin
{
    interface IBlyncWatcher
    {
        void Initialize();

        void Pause();

        void Resume();

        void Close();

        event EventHandler StatusChanged;
    }
}

    In the BlyncLightControl.xaml.cs file, you can see how easy it is to use the interface and the concrete classes.

    BlyncLightControl
//------------------------------------------------------------------------------
// <copyright file="BlyncLightControl.xaml.cs" company="Company">
//     Copyright (c) Company.  All rights reserved.
// </copyright>
//------------------------------------------------------------------------------

namespace BlyncLightAddin
{
    using System;
    using System.Diagnostics.CodeAnalysis;
    using System.Windows;
    using System.Windows.Controls;
    using System.Runtime.InteropServices;
    using System.Text;
    using Microsoft.VisualStudio.Shell.Interop;

    /// <summary>
    /// Interaction logic for BlyncLightControl.
    /// </summary>
    public partial class BlyncLightControl : UserControl
    {
        IBlyncWatcher debugWatcher;
        IBlyncWatcher activeWindowWatcher;

        /// <summary>
        /// Initializes a new instance of the <see cref="BlyncLightControl"/> class.
        /// </summary>
        public BlyncLightControl()
        {
            this.InitializeComponent();
            debugWatcher = new DebuggerWatcher();
            debugWatcher.StatusChanged += DebugWatcher_StatusChanged;
            debugWatcher.Initialize();

            activeWindowWatcher = new ActiveWindowWatcher();
            activeWindowWatcher.StatusChanged += ActiveWindowWatcher_StatusChanged;
            activeWindowWatcher.Initialize();
        }

        private void ActiveWindowWatcher_StatusChanged(object sender, EventArgs e)
        {
            var es = e as StatusChangedEventArgs;

            switch (es.Status)
            {
                case Status.Default:
                    ChangeColor(Blynclight.BlynclightController.Color.Yellow);
                    break;
                case Status.Busy:
                    ChangeColor(Blynclight.BlynclightController.Color.Red);
                    break;
                case Status.Available:
                    ChangeColor(Blynclight.BlynclightController.Color.Green);
                    break;
                default:
                    break;
            }
        }

        private void DebugWatcher_StatusChanged(object sender, EventArgs e)
        {
            var es = e as StatusChangedEventArgs;

            switch (es.Status)
            {
                case Status.Default:
                    ChangeColor(Blynclight.BlynclightController.Color.Yellow);
                    break;
                case Status.Busy:
                    activeWindowWatcher.Pause();
                    ChangeColor(Blynclight.BlynclightController.Color.Red);
                    break;
                case Status.Available:
                    activeWindowWatcher.Resume();
                    ChangeColor(Blynclight.BlynclightController.Color.Green);
                    break;
                default:
                    break;
            }
        }

        ~BlyncLightControl()
        {
            debugWatcher.Close();
            debugWatcher.StatusChanged -= DebugWatcher_StatusChanged;
            activeWindowWatcher.Close();
            activeWindowWatcher.StatusChanged -= ActiveWindowWatcher_StatusChanged;
        }

        /// <summary>
        /// Handles a click on the button by changing the light color.
        /// </summary>
        /// <param name="sender">The event sender.</param>
        /// <param name="e">The event args.</param>
        [SuppressMessage("Microsoft.Globalization", "CA1300:SpecifyMessageBoxOptions", Justification = "Sample code")]
        [SuppressMessage("StyleCop.CSharp.NamingRules", "SA1300:ElementMustBeginWithUpperCaseLetter", Justification = "Default event handler naming pattern")]
        private void button1_Click(object sender, RoutedEventArgs e)
        {
            ChangeColor(Blynclight.BlynclightController.Color.Blue);
        }

        private void ChangeColor(Blynclight.BlynclightController.Color color)
        {
            Blynclight.BlynclightController controller = new Blynclight.BlynclightController();
            int numDevices = 0;
            try
            {
                numDevices = controller.InitBlyncDevices();
                if (numDevices > 0)
                {
                    switch (color)
                    {
                        case Blynclight.BlynclightController.Color.Blue:
                            controller.TurnOnBlueLight(0);
                            break;
                        case Blynclight.BlynclightController.Color.Cyan:
                            controller.TurnOnCyanLight(0);
                            break;
                        case Blynclight.BlynclightController.Color.Green:
                            controller.TurnOnGreenLight(0);
                            break;
                        case Blynclight.BlynclightController.Color.Off:
                            controller.ResetLight(0);
                            break;
                        case Blynclight.BlynclightController.Color.Purple:
                            controller.TurnOnMagentaLight(0);
                            break;
                        case Blynclight.BlynclightController.Color.Red:
                            controller.TurnOnRedLight(0);
                            break;
                        case Blynclight.BlynclightController.Color.White:
                            controller.TurnOnWhiteLight(0);
                            break;
                        case Blynclight.BlynclightController.Color.Yellow:
                            controller.TurnOnYellowLight(0);
                            break;
                        case Blynclight.BlynclightController.Color.Orange:
                            controller.TurnOnOrangeLight(0);
                            break;
                        default:
                            controller.ResetLight(0);
                            break;
                    }
                }
            }
            catch (Exception oops)
            {
                // Swallow device errors; failing to change the light is not fatal.
            }
            finally
            {
                controller.CloseDevices(numDevices);
            }
        }
    }
}

Again, I could have made this more user-driven: provided a UI for the developer to choose which color is shown when, whether the light blinks, and even supported multiple devices.  The Blynclight SDK comes with a sample called BlyncLightTest, a Windows Forms solution that you can grab some inspiration from.


    Summary

    This was cathartic for me.  I really just wanted to know how to code a VSIX extension and to spend some time in Visual Studio doing something other than Azure.  I coded this in about a day (gotta love editor inheritance from Stack Overflow). 

    The code is available for download at https://github.com/kaevans/BlyncLightAddin

    For More Information

    Adding a Tool Window

    Lync + BusyLight = Great Solution for the Home Office

    Is Daddy on a call? A BusyLight Presence indicator for Lync for my Home Office

    Blync SDK – Create Your Own Blync Applications (requires free registration)

    Sign a .NET Assembly with a Strong Name Without Recompiling

    Detect active window changed using C# without polling

    The code is available for download at https://github.com/kaevans/BlyncLightAddin

    MSDN Blogs: Power BI webinars for the week of 8/14: R for the Masses and Using Power BI to Build Financial Dashboards


     

After a two-week break, the Power BI webinars are back with two of the most requested topics: R and financial dashboards.  The details about the sessions:

     

     

    Using Power BI to build Financial Dashboards by Avi Singh

    If you work in Finance or with Financial data, it is likely that you have a love-hate relationship with Excel. The good news is, while you were not looking, Excel has been transformed. The modern Excel, teamed up with Power BI helps you quickly go from data to insights.

This means that financial analysts, rather than wrestling with data and queries, could instead focus on serving their customer – the business – by providing smart analysis, insights, and financial guidance.

    In this webinar, we will cover a real case study for a Finance team and show you how you can leverage Excel and Power BI to

    • Combine Multiple Data Sources with Easy Refresh: Combine multiple official sources (e.g. General Ledger, Personnel) with your own custom datasets/mappings. All, while providing one-click or automated refresh.
    • Establish a Single Source of Truth: All reports (in Power BI or Excel) would connect to the Power BI Data Model as the verified source of the truth. This avoids any confusion around numbers not matching across various reports.
    • Analyze Your Data Easily: You can easily slice-and-dice your data using any Dimension or Lookup Table in the Data Model. E.g. by Product, Department, Geography.
    • Define Complex Calculations: Power BI offers a powerful formula language in DAX, which allows us to build sophisticated functionality in our reporting. Such as ability to easily select a time frame – Closed Month, Quarter-to-Date, Year-to-Date; select a specific comparison – Actuals against Budget/Forecast/Prior Year Actuals.
    • Personalized Dashboard using Row Level Security at SSAS Tabular. This shows each user a customized dashboard based on the Organization Group they belong to
    • Natural Language Q&A via Power BI or directly from desktop using Cortana

    Date: August 18, 2016  10:00 AM – 11:00 AM Pacific Time

To register: https://info.microsoft.com/CO-PowerBI-WBNR-FY17-08Aug-16-PowerBI-Financial-Dashboards-Registration.html

     

     

    About
    Avi Singh is a Power BI trainer and consultant based out of Seattle. He is a Microsoft MVP, co-author of the top-selling book “Power Pivot and Power BI: An Excel User’s Guide” and a regular speaker at conferences and user events.

    Avi has personally experienced the transformation and empowerment that Power BI can bring, going from an Excel user to building large-scale Power BI solutions. His mission now is to share his knowledge about Power BI.

     

    You can follow him on his blog at www.powerpivotpro.com/author/avichal/ or video blog at https://www.youtube.com/powerpivotpro.

     

    R for the masses with Power BI by Sharon Laivand

     

If you missed it, a new set of rich R visuals is now fully integrated into Power BI service reports; they can be filtered, cross-filtered, and pinned to dashboards. In this webinar, Sharon Laivand will show how the new R visuals can be viewed by Power BI users without their having to be aware of the underlying technology. This enables Power BI authors to take advantage of the thousands of ready-to-use packages, the extensibility of the R language, and its endless complementary capabilities for analyzing and visualizing data.

     

    About the Speaker:

    Before joining the Power BI Team, Sharon was a senior product manager in the AD Identity & Access Management (IAM) product group defining the next generation of Microsoft’s IAM solutions. Sharon has over 20 years of experience in the software security domain and the experience of defining a few MS products in this space, such as the authentication flows in Windows Server 2012 R2 – Web Application Proxy Role.  Sharon now specializes in making R easier to use and integrating it into the Power BI suite of Products.

     

Register: https://info.microsoft.com/CO-PowerBI-WBNR-FY17-08Aug-18-PowerBI-R-for-Masses-Registration.html

    Date: August 18, 2016  10:00 AM – 11:00 AM Pacific Time

    MSDN Blogs: The week in .NET – 8/9/2016


    To read last week’s post, see The week in .NET – 8/2/2016.

    On .NET

Last week on the show, we had Frank Krueger on to talk about his amazing Continuous C# and F# IDE for the iPad.

    This week, we’ll speak with Francisco Monteverde about PlasticSCM.

    Package of the week: OxyPlot

    OxyPlot is an open source and cross-platform plotting library.

    Here’s how you’d plot the cos function in a Universal Windows application:
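A minimal sketch of the idea, assuming a view model whose PlotModel is bound to an OxyPlot PlotView in XAML (the class and property names here are illustrative):

using System;
using OxyPlot;
using OxyPlot.Series;

public class MainViewModel
{
    public PlotModel Model { get; private set; }

    public MainViewModel()
    {
        Model = new PlotModel { Title = "cos(x)" };
        // FunctionSeries samples Math.Cos from 0 to 10 in steps of 0.1.
        Model.Series.Add(new FunctionSeries(Math.Cos, 0, 10, 0.1, "cos(x)"));
    }
}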

    A cosine plotted in a phone emulator

    Tool of the week: .NET API Catalog

    .NET API Catalog is a new tool that makes it easy to explore .NET APIs and figure out what support exists for each API on each version of .NET: .NET Framework, .NET Standard, Mono, or even Silverlight. I particularly like the hackable URLs that enable you to easily reach any API from its long name.

The tool runs on Azure, with an in-memory object model that is pulled from Azure Blob Storage. Deployment of the web site is fully automated and happens each time a commit lands in our internal CoreFxTools Git repo, which is hosted on VSTS, without disrupting service.

    Game of the Week: Dreamfall Chapters

Dreamfall Chapters is the latest sequel to the hit adventure games The Longest Journey and Dreamfall: The Longest Journey. The Longest Journey series takes place in two parallel universes: Stark, a cyberpunk future Earth, and Arcadia, a magical fantasy realm. You take on the role of two heroes as you follow their unlikely journey to save both worlds. Experience a deep and engaging story that is shaped by your decisions, and make no mistake, those decisions will have consequences that matter! Dreamfall Chapters is broken into five episodes, each of which shows you how your decisions ranked against those that other players made.

    Make no mistake: the Dreamfall series is one of the very best in the point-and-click adventure genre. It has amazing and deep storytelling. A personal favorite, very highly recommended.

[screenshot: Dreamfall Chapters]

    Dreamfall Chapters was created by Red Thread Games using Unity and C#. They also use Azure for their online server. Dreamfall Chapters is available on Windows, Mac OS X and Linux via Good Old Games and Steam as well as PlayStation 4.

    User group meeting of the week: IOT RpiCar and ASP.NET Core + Docker in Bucharest, Romania

    Join the ADCES group and Victor Hurdugaci on Tuesday, August 9 at 7:00PM at the AFI PARK 2, Bucharest, Romania for a session about IOT RpiCar and ASP.NET Core + Docker.

    .NET

    ASP.NET

    F#

    Check out F# Weekly for more great content from the F# community.

    Xamarin

    Games

    And this is it for this week!

    Contribute to the week in .NET

    As always, this weekly post couldn’t exist without community contributions, and I’d like to thank all those who sent links and tips.

    You can participate too. Did you write a great blog post, or just read one? Do you want everyone to know about an amazing new contribution or a useful library? Did you make or play a great game built on .NET? We’d love to hear from you, and feature your contributions on future posts:

    This week’s post (and future posts) also contains news I first read on The ASP.NET Community Standup, on Weekly Xamarin, on F# weekly, on ASP.NET Weekly, and on Chris Alcock’s The Morning Brew.

    MSDN Blogs: How to leverage the Azure Security Center & Microsoft Operations Management Suite for an incident response

    MSDN Blogs: Department of Defense L4: FAQ


As more DoD customers look to leverage Azure Government cloud computing capabilities to modernize and create opportunities for agility within the mission space, a common set of questions arises. We pulled the most frequently asked questions together to provide some guidance.

     

    Q: As a DoD customer, how do I acquire Azure Government?

A: DoD customers have many options for acquiring Azure Government cloud services – either through one of our authorized Microsoft Licensing Partners or through one of the many Microsoft Solution Partners on a range of contract vehicles, including:

    • GSA IT-70
    • GSA Cloud Services OCSC BPA
    • Department of Interior FCHS
    • NASA SEWP V
    • NITAAC CIO-SP3
    • NITAAC CIO-CS
    • GSA Alliant
    • Air Force NETCENTS-2
    • Air Force NETCENTS-2 (SB)
    • Navy Seaport-e

    Customers pay only for the Azure Government services consumed – with no upfront capital investment.

     

    Q: What types of solutions can I build on the services available on Azure Government today?

A: DoD customers are deploying a wide range of solutions on Azure Government today built on both our IaaS and PaaS offerings.  These include:

     

    Q: Is Azure Government approved in the DoD for production workloads?

    A: Yes, Azure Government is approved for development, test AND production workloads that are categorized by the data owner or equivalent as Impact Level 4.

     

    Q:  How is Azure Government different from competitive offerings from other commercial cloud vendors?

A: Azure Government is an engineered solution built specifically for the needs of our Federal, State, and Local government customers.  As such, we offer a number of differentiators from our competition:

     

    • Microsoft Azure Government provides a government-only network spanning multiple geographic regions greater than 500 miles apart for high availability and disaster recovery purposes.
    • Microsoft Azure Government is the only commercial, hyperscale cloud vendor with DoD Impact Level 4 Provisional Authorization offering both IaaS and PaaS services. PaaS services provide the DoD customer with significant opportunities beyond simply re-hosting to reduce both capital and operational expenditures by reducing the costs of managing and maintaining infrastructure and perpetual software licensing as well as opportunities to innovate and transform legacy applications.
    • Microsoft offers enterprise scale hybrid cloud capability that does not rely on third party vendors for hosting facilities or bolt-on solutions. Given the DoD’s investments in on-premises applications and datacenter infrastructure, it is critical that there be a glide path for migrating from on-premises to commercial cloud solutions.  ‘Cloud First’ should not and cannot mean ‘Cloud Only’.
    • Microsoft provides first party support for Active Directory, the directory services standard throughout the DoD, with Azure Active Directory. Our competition relies on 3rd party tools and deploying additional infrastructure to support directory services and identity federation.
    • Microsoft offers first-party support for workloads deployed on Azure Government that utilize other Microsoft technology (Windows Server, SQL Server, SharePoint, etc.) as well as that same level of support for Oracle, Red Hat and other 3rd party partnerships. Our competition provides ‘best effort’ support for Microsoft workloads and varying degrees of support for third-party solutions.

     

     

    Q: What Azure Government services are included in your current DoD Impact Level 4 Provisional Authorization?

    A: The following Azure Government services are currently covered by our current DoD Impact Level 4 Provisional Authorization:

    • Azure Active Directory
    • Application Gateway
    • Cloud Services
    • Load Balancer
    • SQL Database
    • Storage
    • Traffic Manager
    • Virtual Machines
    • Virtual Network
    • VPN Gateway
    • Azure Key Vault
    • Azure Web Apps
    • ExpressRoute

     

    Q: How will Azure Government make services available and compliant for government customers so that Impact Level 4 doesn’t become stale? 

A: Our Azure Government Engineering team is committed to a continuous compliance program by which Microsoft submits additional services to FedRAMP and the DoD for accreditation on a regular, repeatable cadence.  Azure Government is an ‘evergreen’ platform – introducing innovation through new services and the expansion of existing services, with complementary compliance capability, on a continuous basis – similar to how we achieve these same goals in the public cloud.  Look for updates in this space related to new service announcements and compliance updates.

     

Q: With Impact Level 4 workloads, traffic must be routed through a DoD-approved cloud access point.  Does Microsoft’s Azure Government solution support this?

     

    A: We support connectivity to multiple DoD cloud access points (CAPs) through our ExpressRoute service.  This service allows customers to create private connections between Azure Government’s datacenters and DoD network infrastructure in 3rd party colocation environments.  More information on the ExpressRoute service can be found at https://azure.microsoft.com/en-us/services/expressroute/.

     

    Q: How do I get access to try Azure Government?

    A: Request an Azure Government Trial.

     

    We welcome your comments and suggestions to help us improve your Azure Government experience. To stay up to date on all things Azure Government, be sure to subscribe to our RSS feed and to receive emails by clicking “Subscribe by Email!” on the Azure Government Blog.

     

    MSDN Blogs: Changing the automatic scheduling direction when starting a production order that has not been scheduled.


Historically, there has been no user option or global setting for selecting or defaulting the scheduling direction used for automatic scheduling when the scheduling step is skipped on a production order. However, KB 3070997 and CU10 for AX 2012 introduced new functionality that can be used to influence the scheduling direction during the start process on a production order. There is still no separate setup option, but you can now save the scheduling direction used with the scheduling process into usage data, and that will then be used by the start process. With these changes, you can change the scheduling direction used when starting a production order that hasn’t been previously scheduled by going through the following steps:

1. Set usage data for Operation scheduling. Note: you may need to go through this step for Job scheduling as well; it depends on the scheduling method specified for Automatic update, which can be found under Production control | Setup | Production control parameters (by site), and then clicking the Automatic update option in the left-hand navigation. The start process looks at this setup to determine which type of automatic scheduling to run, and pulls the usage data from the corresponding scheduling setup form.
      1. From the Operation scheduling form (or Job scheduling form) accessed from the Production order form (Production control | Common | Production orders | All production orders), click on the Default values button.
      2. In the Set up operations scheduling form, select the desired operation scheduling direction and other parameters.
      3. Click the User default button to save the defaults.
      4. Click the Apply button.
      5. Click the OK button to save and close the form.
      6. You can then cancel out of the Operations scheduling form without running scheduling if you are just creating the usage data.
    2. Set usage data for the Start process.
      1. From the Start form accessed from the Production order form (Production control | Common | Production orders | All production orders), click on the Default values button.
      2. Set values as desired.
  3. Click the User default button. Note that there is no selection for the scheduling direction, but clicking the User default button is critical, as it pulls the value from the scheduling usage data set in the earlier steps.
      4. Click the Apply button.
      5. Click the OK button.

Unless you reset usage data, after going through the above steps a single time you should be able to start production orders and have them use the specified scheduling direction (when they have not been previously scheduled) without going through the steps again, since the information is now held in usage data.


    MSDN Blogs: Couchbase with Windows and .NET – Part 3


    Editor’s note: The following post was written by Visual Studio and Development Technologies MVP Matt Groves as part of our Technical Tuesday series with support from his technical editor, Visual Studio and Development Technologies MVP Travis Smith.

In this three-part series, we’re looking at the basics of interacting with Couchbase for .NET developers on Windows. We’ll start with the basics and build towards a “vertical” slice of a complete ASP.NET MVC app on the .NET 4.x framework. For a deeper dive, please check out my blog posts on Couchbase and the Couchbase Developer Portal.

    In the first part, we installed Couchbase Server and went over the basics of how it works.

    In the second part, we looked at using ASP.NET with Couchbase Server.

    In this final part, we’ll implement all the CRUD functionality in an ASP.NET application.

    1. Linq2Couchbase

    Couchbase Server supports a query language known as N1QL. It’s a superset of SQL, and allows us to leverage existing knowledge of SQL syntax to construct very powerful queries over JSON documents in Couchbase. Linq2Couchbase takes this a step further and converts LINQ queries into N1QL queries (much like Entity Framework converts LINQ queries into SQL queries).

    Linq2Couchbase is part of Couchbase Labs, and is not yet part of the core, supported Couchbase .NET SDK library. However, if you’re used to Entity Framework, NHibernate.Linq, or any other LINQ provider, it’s a great way to introduce yourself to Couchbase. For some operations, we will still need to use the core Couchbase .NET SDK, but there is a lot we can do with Linq2Couchbase.

    Start by adding Linq2Couchbase with NuGet (if you haven’t already).

To use N1QL (and therefore Linq2Couchbase), the bucket must be indexed. Go into Couchbase Console, click the ‘Query’ tab, and create a primary index on the default bucket (CREATE PRIMARY INDEX ON `default`;).

[screenshot: creating a primary index in the Couchbase Console Query tab]

    If we don’t have an index, Linq2Couchbase will throw a helpful error message like “No primary index on keyspace default. Use CREATE PRIMARY INDEX to create one.”

    In order to use Linq2Couchbase most effectively, we have to start giving Couchbase documents a “type” field. This way, we can differentiate between a “person” document and a “location” document. In this example, we’re only going to have “person” documents, but it’s a good idea to do this from the start. We’ll create a Type field, and set it to “Person”. We’ll also put an attribute on the C# class so that Linq2Couchbase understands that this class corresponds to a certain document type.

[code screenshot]
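As a rough sketch, the annotated class might look like this (the DocumentTypeFilter attribute comes from Linq2Couchbase; the namespace and the Name property are assumptions carried over from earlier parts):

using Couchbase.Linq.Filters;

namespace CouchbaseExample.Models
{
    // Tells Linq2Couchbase which document type this class maps to.
    [DocumentTypeFilter("Person")]
    public class Person
    {
        public string Name { get; set; }

        // The "type" field stored in every Person document.
        public string Type
        {
            get { return "Person"; }
        }
    }
}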

    After we make these changes, the app will continue to work. This is because we are still retrieving the document by its key. But now let’s change the Index action to try and get ALL Person documents.

[code screenshot]

    We’ll implement that new GetAll repository method using Linq2Couchbase:

[code screenshots]
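A minimal sketch of what that method might look like (the repository shape is an assumption; ScanConsistency and the query extension come from the Couchbase.N1QL and Couchbase.Linq.Extensions namespaces):

using System.Collections.Generic;
using System.Linq;
using Couchbase.Linq;            // IBucketContext
using Couchbase.Linq.Extensions; // ScanConsistency() query extension
using Couchbase.N1QL;            // ScanConsistency enum

public class PersonRepository
{
    private readonly IBucketContext _context;

    public PersonRepository(IBucketContext context)
    {
        _context = context;
    }

    public List<Person> GetAll()
    {
        return _context.Query<Person>()
            .ScanConsistency(ScanConsistency.RequestPlus) // discussed below
            .OrderBy(p => p.Name)
            .ToList();
    }
}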

    In this example, we’re telling Couchbase to order all the results by Name. At this point, we can experiment with the normal LINQ methods: Where, Select, Take, Skip, and so on.

    Just ignore that ScanConsistency for now: we’ll discuss it more later. But what about that IBucketContext? The IBucketContext is similar to DbContext for Entity Framework, or ISession for NHibernate. To get that IBucketContext, we’ll make some changes to HomeController.

[code screenshot]
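A sketch of what those changes might look like, assuming ClusterHelper was initialized at application startup (e.g., in Global.asax):

using System.Web.Mvc;
using Couchbase;      // ClusterHelper
using Couchbase.Linq; // BucketContext

public class HomeController : Controller
{
    private readonly PersonRepository _repository;

    public HomeController()
    {
        // Wrap the bucket in a BucketContext and hand it to the repository.
        var bucket = ClusterHelper.GetBucket("default");
        _repository = new PersonRepository(new BucketContext(bucket));
    }

    public ActionResult Index()
    {
        return View(_repository.GetAll());
    }
}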

    We’re doing it this way for simplicity, but I recommend that you use a dependency injection framework (like StructureMap) to handle this, otherwise you’ll end up copy/pasting a lot of code into your Controllers.

    Now, if we compile and run the web app again, it will display “There are no people yet”. Hey, where did that person go?! It didn’t show up because the “foo::123” document doesn’t have a “type” field yet. Go to Couchbase Console and add it.

[screenshot: adding the type field in Couchbase Console]

    Once we do that and refresh the web page, the person will appear again.

    1.1 A quick note about ScanConsistency

Linq2Couchbase relies on an index to generate and execute queries. Adding a new document triggers an index update. Until the index finishes updating, any documents not yet indexed will not be returned by Linq2Couchbase (by default). By adding a ScanConsistency of RequestPlus (see the Couchbase documentation for the details about scan consistency), Linq2Couchbase will effectively wait until the index is updated before executing the query and returning a response. This is a tradeoff that you will have to think about when designing your application. Which is more important: raw speed or complete accuracy? The Couchbase SDK defaults to raw speed.

    2. A complete ASP.NET CRUD implementation

    Let’s round out the sample app that we’ve been building with a full suite of CRUD functionality. The app already shows a list of people. We’ll next want to:

    • Add a new person via the web app (instead of directly in Couchbase Console)
    • Edit a person
    • Delete a person.

Before I start, a disclaimer. I’ve made some modeling decisions in this sample app. I’ve decided that keys to Person documents should be of the format “Person::{guid}”, and I’ve decided that we will enforce the “Person::” prefix at the repository level. I’ve also decided not to use any intermediate view models or edit models in my MVC app, for the purposes of a concise demonstration. By no means do you have to make the same decisions I did! I encourage you to think through the implications for your particular use case, and I would be happy to discuss the merits and trade-offs of each approach in the comments or in the Couchbase Forums.

    2.1 Adding a new person document

    Up until now, we’ve used the Couchbase Console to create new documents. Now let’s make it possible via a standard HTML form on an ASP.NET page.

    First, we need to make a slight change to the Person class:

[code screenshot]

    We added an Id field, and marked it with the [Key] attribute. This attribute comes from System.ComponentModel.DataAnnotations, but Linq2Couchbase interprets it to mean “use this field for the Couchbase key”.
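So the class might now look like this sketch (only the Id property and its [Key] attribute are new):

using System.ComponentModel.DataAnnotations;
using Couchbase.Linq.Filters;

[DocumentTypeFilter("Person")]
public class Person
{
    // Linq2Couchbase uses this property as the Couchbase document key.
    [Key]
    public string Id { get; set; }

    public string Name { get; set; }

    public string Type
    {
        get { return "Person"; }
    }
}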

    Now, let’s add a very simple new action to HomeController:

[code screenshot]

And we’ll link to that from the Bootstrap navigation (which I snuck in previously, and which by no means are you required to use):

[code screenshot]

    Nothing much out of the ordinary so far. We’ll create a simple Edit.cshtml with a straightforward, plain-looking form.

[code screenshots]

    Since that form will be POSTing to a Save action, let’s create that next:

[code screenshot]

    Notice that the Person type used in the parameter is the same type as before. Here is where a more complex web application would probably want to use an edit model, validation, mapping, and so on. I’ve omitted that, and I send the model straight to a new method in PersonRepository:

[code screenshot]
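Continuing the PersonRepository sketch from earlier, the method might look roughly like this (the “Person::{guid}” convention is from the modeling decisions above):

public void Save(Person person)
{
    // A new document gets a generated key; an edited one keeps its own.
    if (string.IsNullOrEmpty(person.Id))
    {
        person.Id = "Person::" + Guid.NewGuid();
    }

    // Save performs an upsert: add if the key is new, update otherwise.
    _context.Save(person);
}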

    This repository method will set the Id, if one isn’t already set (it won’t be now, but it will be later, when we cover ‘Edit’). The Save method on IBucketContext is from Linq2Couchbase. It will add a new document if the key doesn’t exist, or update an existing document if it does. It’s known as an “upsert” operation. In fact, we can do nearly the same thing without Linq2Couchbase:

[code screenshot]
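For instance, given an IBucket instance, the upsert collapses to a one-liner:

// Insert if the key doesn't exist, replace the document if it does.
bucket.Upsert(person.Id, person);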

    2.2 Editing an existing person document

    Now, we want to be able to edit an existing person document in my ASP.NET site. First, let’s add an edit link to each person, by making a change to _person.cshtml partial view.

[code screenshot]

We also added a “delete” link while we were in there, which we’ll get to later. One more thing to point out: when creating the routeValues, we stripped “Person::” out of the Id. If we don’t do this, ASP.NET will complain about a potentially malicious HTTP request. It would probably be better to give each person document a friendlier “slug” to use in the URL, or maybe to use that as the document key. That’s going to depend on your use case and your data design.

    Now we need an Edit action in HomeController:

[code screenshot]

    We’re reusing the same Edit.cshtml view, but now we need to add a hidden field to hold the document ID.

[code screenshot]

    Alright! Now we can add and edit person documents.

    This may not be terribly impressive to anyone already comfortable with ASP.NET MVC. So, next, let’s look at something cool that a NoSQL database like Couchbase brings to the table.

    2.3 Iterating on the data stored in the person document

    A new requirement is that we want to collect more information about a Person. Let’s say we want to get a phone number, and a list of that person’s favorite movies. With a relational database, that means that we would need to add at least two columns, and more likely, at least one other table to hold the movies, with a foreign key.

    With Couchbase, there is no explicit schema. Instead, all we have to do is add a couple more properties to the Person class.

[code screenshot]
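Something like this (the property names are assumed from the requirement above; List&lt;string&gt; needs System.Collections.Generic):

// Added to the Person class; no schema migration required.
public string PhoneNumber { get; set; }
public List<string> FavoriteMovies { get; set; }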

    That’s pretty much it, except that we also need to add a corresponding UI. I used a bit of jQuery to allow the user to add any number of movies. I won’t show the code for it here, because the implementation details aren’t important. But I have made the whole sample available on Github, so you can follow along or check it out later if you’d like.

    We also need to make changes to _person.cshtml to (conditionally) display the extra information:

[code screenshot]

    And here’s how that would look (this time with two Person documents):

[screenshot: the people list showing the extra details]

We didn’t have to migrate a SQL schema. We didn’t have to create any sort of foreign key relationship. We didn’t have to set up any OR/M mappings. We simply added a couple of new fields, and Couchbase turned it into a corresponding JSON document.

[screenshot: the resulting JSON document]

    2.4 Deleting a person document

    We already added the “Delete” link, so we need to create a new Controller action…

[code screenshots]

    …and a new repository method:

[code screenshot]
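A sketch of that method, assuming the repository also holds a reference to the underlying IBucket:

public void Delete(string id)
{
    // IBucket.Remove needs only the key, not the whole document.
    _bucket.Remove(id);
}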

Notice that this method is not using Linq2Couchbase. It’s using the Remove method on IBucket. A Remove method is also available on IBucketContext, but we would need to pass it an entire document, not just a key. I elected to use IBucket, but there’s nothing inherently superior about it.

    2.5 Wrapping up

    Thanks for reading through this blog post series. Hopefully, you’re on your way to considering or even including Couchbase in your next ASP.NET project. Here are some more interesting links for you to continue your Couchbase journey:

• There is a .NET Identity Provider for Couchbase (GitHub). At the time of this blog post, it’s an early developer preview, and is missing support for social logins.
• Linq2Couchbase is a great project with a lot of features and documentation, but it’s still a work in progress. If you are interested, I suggest visiting Linq2Couchbase on GitHub. Ask questions on Gitter, and feel free to submit issues or pull requests.

I’ve put the full source code for this example on GitHub.

    What did I leave out? What’s keeping you from trying Couchbase with ASP.NET today? Please leave a comment, ping me on Twitter, or email me (matthew.groves AT couchbase DOT com). I’d love to hear from you.

    Matt

    About the author

Matt is a guy who loves to code. It doesn’t matter if it’s C#, jQuery, or PHP: he’ll submit pull requests for anything. He has been coding ever since he wrote a QuickBASIC point-of-sale app for his parents’ pizza shop back in the 90s. He currently works as a Developer Advocate for Couchbase. His free time is spent with his family, watching the Reds, and getting involved in the developer community. He is the author of AOP in .NET (published by Manning).

     

    MSDN Blogs: When you create a route card journal the resource is not defaulting as the resource group that was listed as the resource requirement.


     

    This could be working correctly if the following factors are true:

    1. The resource requirement was a resource group.
    2. The ‘Update capacity plan’ option is enabled on the journals tab of the Production control parameters form.
    3. Previous Route card journals have been posted to meet/exceed the capacity reservations.

     

When the above is true, it can be working as designed that the first resource in alphabetical order defaults onto new route card journals. Normally, route card journals default the resource from the capacity reservations. With the ‘Update capacity plan’ option enabled, once the capacity is completely consumed, the capacity reservations are removed and are no longer available for reference. When this happens, the resource is pulled alphabetically from the list of possible resources.

     

    A couple of ways around this:

    1. Unmark the update capacity plan option. Capacity reservations are then maintained and available for reference for the route cards. This is a business decision that has other implications and would therefore need to be researched and tested.
    2. Use specific resources as the resource requirement. When specific resources are used, they are then available in the production route table, and can be defaulted onto new route card journals from there. Again this is a business decision that has other implications and it should therefore be researched and tested before making any adjustments to setup.
    3. Manually update the resource on the route card.

    MSDN Blogs: Interacting with IL-generated TempDB data


Together with Microsoft Dynamics AX 2012, a new execution mode was introduced, called the “IL” execution mode, which some references also call the “CIL” execution mode.
“IL” stands for .NET Intermediate Language, referring to the Intermediate Language available in .NET technology; see also X++ Compiled to .NET CIL and X++ Scenarios that are Not Supported in CIL for more information on this topic.

In general, the purpose of this new execution mode is to take as much advantage as possible of the performance improvements that .NET IL execution can bring to Microsoft Dynamics AX 2012.

    Scenario
    While we could write several pages about the IL execution mode, this post focuses on just one practical question that is useful to know in certain scenarios, especially when you are busy with performance optimization of your X++ code.
    Here is the question:

    “Suppose you use IL-execution mode to populate some data in a tempDB-based temporary table,
    How would you access this data back from the X++ context? (i.e. to visualize in a form)”

    Demo
    This post walks you through the answer step by step.
    First of all, there are a couple of important facts to know:

    • The tempDB temporary tables in Microsoft Dynamics AX 2012 are created under the normal tables node in tempDB;
    • You may retrieve the name of the physical table that holds the data of your temp table instance in tempDB using the getPhysicalTableName() method;
    • You may reuse an existing tempDB temporary table instance by calling useExistingTempDBTable(…);

    Together, these form the ingredients of the sample code that demonstrates this case.

    The following shows how you would link to an existing tempDB temporary table:

    // Sample code which shows how to link to an existing tempDB temporary table
    public static server container runStatic(container c)
    {
        container ret;
        TmpTempDBTable tmpTable1;
        int i;
        str tempTableName = conPeek(c, 1);

        // Link to an existing table
        tmpTable1.useExistingTempDBTable(tempTableName);

        // Generate some data in the temp table
        for (i = 2; i <= conLen(c); i++)
        {
            tmpTable1.ID = strFmt("Value %1", conPeek(c, i));
            tmpTable1.insert();
        }

        // Send the physical table name back to be used by the caller method
        ret = [tmpTable1.getPhysicalTableName()];

        return ret;
    }

    And finally, the following shows how to pass the reference name of a tempDB temporary table to the IL session, and then fetch, from the X++ context, the data that the IL session populated:

    // NOTE: It is important to know that X++ IL execution is only allowed on the server side
    public static server void runInIL()
    {
        TmpTempDBTable tmpTable1, tmpTable1_filled;
        str tempTableName;
        container data;
        container res;

        XppILExecutePermission xppILExecutePermission = new XppILExecutePermission();
        xppILExecutePermission.assert();

        // Initialize the physical table
        tmpTable1 = null;
        select firstOnly tmpTable1;

        // Prepare the parameters to pass to the IL session
        data = [tmpTable1.getPhysicalTableName(), 1, 2, 3];

        // Go IL
        res = SysDictClass::invokeStaticMethodIL(classStr(Tst_TempDB_IL_Interaction),
                                                 staticMethodStr(Tst_TempDB_IL_Interaction, runStatic),
                                                 data, true);

        [tempTableName] = res;

        // Process results: link to the existing tempDB table
        tmpTable1_filled.useExistingTempDBTable(tempTableName);
        while select tmpTable1_filled
        {
            info(strFmt("%1", tmpTable1_filled.ID));
        }

        CodeAccessPermission::revertAssert();
    }

    Downloads
    Sample code – Interacting with IL-generated TempDB data

    Related topics
    Temporary TempDB Tables: https://msdn.microsoft.com/en-us/library/gg845661.aspx


    /*

    DISCLAIMER:

    Microsoft provides programming examples for illustration only,
    without warranty either expressed or implied, including, but not limited to,
    the implied warranties of merchantability or fitness for a particular purpose.

    This post assumes that you are familiar with the programming language that is being demonstrated
    and the tools that are used to create and debug procedures.

    */

    MSDN Blogs: You receive the error: “One or more items associated with this Item model group are controlled by processes defined for warehouse management and the use of reservation hierarchy. FIFO date-controlled reservations will not be used for these items.” when...


    If you have the “FIFO date-controlled” option marked on the Item model group and the “Use warehouse management processes” option marked on the Storage dimension group, receiving this error is by design: the two options cannot be used together. To verify whether these options are marked, go through the following:

    1. Go to Inventory management | Setup | Inventory | Item model groups, and review the item model group that you are trying to switch to and check to see if the “FIFO date-controlled” option is marked.
    2. Go to Product information management | Setup | Dimension groups | Storage dimension groups, and review the dimension group assigned to the item that you are trying to change the item model group on and check to see if the “Use warehouse management processes” option is marked.

    The following information can be used to work around this issue.

    Reservation at the batch level is supported where the “Use warehouse management processes” option is enabled for the item’s storage dimension group, but only when batch is above location in the reservation hierarchy. There, it is supported for the process industries reservation principle, which is FEFO date-controlled by either Expiration date or Best before date. If batch is below location, we never reserve at the batch level. Provided you are not mixing batches with different manufacturing dates on the same location, you can use sorting to find the batch number with the oldest manufacturing date and so simulate the FIFO date-controlled functionality that you were trying to enable on the item model group. In the Location directives form, on the Location Directive Actions FastTab, mark the Batch enabled option.

    Once the Batch enabled option is marked on the location directive, the batch table is added to the query and you can then add the manufacturing date as a sorting criterion.

     

    MSDN Blogs: A quick introduction to .NET Core on Linux


    Microsoft’s developer technology world is more open every day. These days we have even released an entire version of our framework that is completely open source.


    It is now easy to develop not only on Windows, but also on macOS and Linux.

    As the title says, my intention with this post is that in less than 5 minutes you can see how easy it is to develop with .NET on Linux, without paying a single peso.

    To begin, simply watch this video, which is under two minutes long:

     

     

    Now, if you liked it and want to try it on your own Linux machine, it is very simple. For Ubuntu 16.04, follow these installation instructions, which take just two steps:

    1. Set up the apt-get feed that contains the packages we need:

    sudo sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/dotnet-release/ trusty main" > /etc/apt/sources.list.d/dotnetdev.list'
    sudo apt-key adv --keyserver apt-mo.trafficmanager.net --recv-keys 417A0893
    sudo apt-get update

    2. With that in place, we can use apt-get to install the .NET Core SDK:

    sudo apt-get install dotnet-dev-1.0.0-preview2-003121

    Now try what you saw in the video.
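
    In case the video isn’t handy, a minimal first run with this preview2-era SDK looks roughly like this (a sketch; the project name is arbitrary, and dotnet new scaffolded a Hello World console app in the project.json days):

    mkdir hwapp && cd hwapp
    dotnet new        # scaffolds a Hello World console app
    dotnet restore    # restores NuGet packages
    dotnet run        # builds and runs; prints "Hello World!"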

    And if you also want a modern, OSS code editor that is friendly to .NET Core and ASP.NET Core, and that on top of everything runs on Windows, Mac, and Linux, I recommend Visual Studio Code.

    MSDN Blogs: Remote Blob Storage (RBS) client library setup requirements in SQL Server 2016


    Remote BLOB Store (RBS) is a client library for SQL Server that allows developers to store, access, and retrieve binary large objects (BLOBs) outside the SQL Server database files while still maintaining the ACID properties and transactional consistency of the data in the BLOB store. RBS lets you efficiently utilize storage and I/O resources by managing structured and unstructured data together in a SQL Server database: structured data is stored in SQL Server data files, while unstructured data is stored outside SQL Server on a commodity storage solution. The RBS client library exposes a set of APIs for developers to access, modify, and retrieve BLOBs from the BLOB store. Each BLOB store has its own provider library, which plugs into the RBS client library and specifies how BLOBs are stored and accessed alongside the SQL Server database.

    SQL Server FILESTREAM feature allows you to store and manage binary data (varbinary(max)) in SQL Server utilizing the underlying NTFS storage as BLOB store.

    The RBS FILESTREAM provider is one such free, out-of-the-box provider that plugs into the RBS client library to let a deployment use a FILESTREAM-enabled SQL Server database as a dedicated BLOB store for an application. The RBS FILESTREAM provider utilizes the FILESTREAM feature in SQL Server for BLOB storage and ties the two technologies together. SharePoint is one such application: it allows you to use the RBS FILESTREAM provider on web front-end servers and the SQL Server FILESTREAM feature for storing and managing BLOBs on NTFS storage outside the SharePoint content database, as documented in the MSDN article here.

    A number of third-party storage solution vendors have developed RBS providers that conform to these standard APIs and support BLOB storage on various storage platforms.

    For more details on RBS in SQL Server you can refer to the MSDN article here.

    RBS 2016 client libraries are shipped as part of the SQL Server 2016 Feature Pack. As a pre-requisite, RBS requires a SQL Server database for storing BLOB metadata along with the BLOB store. To connect to SQL Server, RBS requires at least ODBC Driver 11 for SQL Server 2014 and ODBC Driver 13 for SQL Server 2016. Drivers are available at Download ODBC Driver for SQL Server. If RBS is installed on the same server as the SQL Server 2016 instance, the ODBC driver is already installed as part of the SQL Server installation. However, when RBS is installed on a separate client server, like a SharePoint WFE in a multi-server farm setup, ODBC Driver 13.0 is not installed on the client server and needs to be installed separately as a pre-requisite for installing the RBS client library.
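
    On a client server you can quickly check whether the driver is already registered with PowerShell (a sketch; Get-OdbcDriver ships with Windows 8/Windows Server 2012 and later):

    # Shows the registered driver entry if ODBC Driver 13 is installed
    Get-OdbcDriver -Name "ODBC Driver 13 for SQL Server"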

    If Microsoft ODBC Driver 13.0 for SQL Server is missing on the client server, the setup may fail with the following error when you hit Test Connection on the Database connection screen:


    If you are installing using command prompt, the output log file will show the following error,

    MSI (s) (CC:FC) [15:12:55:265]: Note: 1: 1723 2: InstallCounters 3: CreateCounters 4: C:\Windows\Installer\MSI8C86.tmp CustomAction InstallCounters returned actual error code 1157 (note this may not be 100% accurate if translation happened inside sandbox)
    MSI (s) (CC:FC) [15:12:55:265]: Product: Microsoft SQL Server 2016 Remote BLOB Store — Error 1723. There is a problem with this Windows Installer package. A DLL required for this installation to complete could not be run. Contact your support personnel or package vendor. Action InstallCounters, entry: CreateCounters, library: C:\Windows\Installer\MSI8C86.tmp
    MSI (s) (CC:FC) [15:12:55:265]: Creating MSIHANDLE (141) of type 790531 for
    thread 30204 Error 1723. There is a problem with this Windows Installer package. A DLL required for this installation to complete could not be run. Contact your support personnel or package vendor. Action InstallCounters, entry: CreateCounters, library: C:\Windows\Installer\MSI8C86.tmp

    Hence, as a pre-requisite for installing RBS on the client server, it is important to install Microsoft ODBC Driver 13.0 for SQL Server, or a higher version, on the client to avoid the above error while running the RBS.msi setup for SQL Server 2016.

    MSDN Blogs: Bing Maps V8 July 2016 Update


    It’s been just over a month since Bing Maps Version 8 (V8) was released, and today we are happy to announce the first of our regular updates to the main release branch of V8. For those not familiar with V8, it now has a new branching system that allows developers to access new features at their own pace. You can find out more about this new branching system here. This update takes all the new features and bug fixes that were added to the experimental branch in July and adds them to the main release branch. Below is a list of some of the key updates:

    Spatial Math Geometry

    The Spatial Math module has had a massive addition consisting of 24 spatial geometry calculations, bringing the total number of calculations available in the spatial math module to 47. Some of these calculations include: binary operations of shapes (intersection, difference, union), convex and concave hulls, Voronoi diagrams, shape validation and much more. Try it now.

    [Image: Voronoi diagram generated with the Spatial Math module]
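
    As a taste of the new geometry calculations, here is a minimal sketch (it assumes a map plus two existing polygon shapes, polygon1 and polygon2; module and method names follow the V8 documentation):

    // Load the Spatial Math module, then push the union of two polygons onto the map
    Microsoft.Maps.loadModule('Microsoft.Maps.SpatialMath', function () {
        var union = Microsoft.Maps.SpatialMath.Geometry.union(polygon1, polygon2);
        map.entities.push(union);
    });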

    Draggable Pushpins

    Easily drag pushpins around on the map by setting the pushpin’s draggable option to true. Try it now.
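
    In code, that option is a one-liner (a minimal sketch, assuming a map variable already exists):

    // Create a pushpin the user can drag, and add it to the map
    var pin = new Microsoft.Maps.Pushpin(map.getCenter(), { draggable: true });
    map.entities.push(pin);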

    Custom Overlays

    Custom overlays allow you to create your own custom rendering layers within the map control. Why might you want to do this, you ask? Several years ago one of our Microsoft Bing Maps MVPs created an open-source heat map layer for Bing Maps Version 7 (V7). To get it to work correctly, he needed to insert an HTML5 canvas into the DOM structure of the map control. There was no supported way to do this, so a hacky solution was implemented, which ended up breaking three times over the lifetime of Bing Maps V7. With this feature we have added a supported way to create custom rendering layers, so that developers can easily create and experiment with new custom data visualizations.

    TypeScript Definitions

    Last week we released TypeScript definitions available for Bing Maps V8 on GitHub. You can find the announcement here. These definitions have been updated to include all the new features that are in this release.

    Clickable Pushpin Area

    Often, when using custom pushpins, the clickable area of the pushpin is rectangular, because the image used to create the pushpin is rectangular. This can cause issues with mouse events: the drawn image is often not a rectangle itself, so the whitespace around it blocks mouse events from reaching the pushpins below it. With this in mind, V8 now lets you specify that a rounded click area should be used instead. Early testing has found that this drastically reduces false clicks on pushpins and thus creates a much better user experience. Here is an example of the difference this makes:

    [Image: rectangular vs. rounded pushpin click areas]
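
    In code, the rounded click area is opted into through a pushpin option (a sketch; the icon path is a placeholder, and the option name follows the V8 pushpin options):

    // Custom pushpin whose transparent corners no longer swallow clicks
    var pin = new Microsoft.Maps.Pushpin(map.getCenter(), {
        icon: 'images/myCustomPin.png',  // placeholder image path
        roundClickableArea: true         // clicks outside the circular area fall through
    });
    map.entities.push(pin);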

    GeoJSON and Query API Shape Styling

    Until now, styling data that came from the GeoJSON module or the Query API in Bing Maps V8 was limited to specifying a single default style used by all shapes. If you wanted to style individual shapes, you had to loop through the results and style each shape individually. With this update, any Pushpin, Polyline or Polygon option can be specified for an individual shape through the shape’s GeoJSON properties, or as a data source column in the Bing Spatial Data Services.
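
    A hedged sketch of what this can look like, assuming the property names in the GeoJSON feature map directly onto the V8 shape options (fillColor and strokeColor are PolygonOptions; the coordinates are arbitrary):

    // Read a GeoJSON feature whose properties carry per-shape styling
    Microsoft.Maps.loadModule('Microsoft.Maps.GeoJson', function () {
        var feature = {
            type: 'Feature',
            geometry: {
                type: 'Polygon',
                coordinates: [[[-122.2, 47.6], [-122.2, 47.7], [-122.1, 47.7], [-122.2, 47.6]]]
            },
            properties: { fillColor: 'rgba(0,120,215,0.5)', strokeColor: 'blue' }
        };
        map.entities.push(Microsoft.Maps.GeoJson.read(feature));
    });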

    Mercator Map Type

    Bing Maps provides road and aerial maps, which are great in most cases, but sometimes you may want to hide these altogether. Perhaps you have your own custom tile layer that you want to display instead, or perhaps you simply want to view your data on its own, without a map background. You can do this now by setting the map type ID of the map to mercator.
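
    Setting it at construction time looks like this (a sketch; supply your own key):

    // A map with no road or aerial imagery: only your shapes and tile layers are drawn
    var map = new Microsoft.Maps.Map('#myMap', {
        credentials: 'Your Bing Maps Key',
        mapTypeId: Microsoft.Maps.MapTypeId.mercator
    });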

    Bug Fixes

    Since the initial release of V8, many developers have been proactive in reporting bugs and testing the fixes in the experimental branch. This release includes over 60 bug fixes.

    A complete list of new features added in this release can be found on the What’s New page in the documentation. We have many other features and functionalities on the road map for Bing Maps V8. If you have any questions or feedback about V8, please let us know on the Bing Maps forums or visit the Bing Maps website to learn more about our V8 web control features.

     


    MS Access Blog: Keep your email secure with Office 365


    As the person responsible for supporting a mobile workforce, you face complicated and ever-changing challenges around data privacy, security and advanced threat protection. Viruses and hacks are real threats that cause damage to businesses every day, and email protection must be at the top of your list.

    With this in mind, we created an advanced secure email service with Office 365.

    Keep your email secure with Office 365 featured image

    Here are just a few of the ways Office 365, including Exchange Online and Outlook, protects your business’s emails:

    Data privacy and security

    It’s your data; you control and manage access with incident reports, proactive controls to maintain compliance and secure mobile freedom. You’ll be confident about where your data is, who has access and what happens to it—with the added bonus of a 99.9 percent uptime commitment—and you can be sure it won’t be mined for ads or shared with third parties.

    Enterprise-level authentication and security certification

    There’s no need for additional virus software for advanced protection. When tackling external threats, use Exchange Online Advanced Threat Protection to secure email inboxes against sophisticated attacks in real time. You’re in control of internal protection: controlling access permissions with information rights management (IRM) to keep unauthorized people from printing, forwarding or copying sensitive information; and controlling transport rules, actions and exceptions with data loss prevention (DLP).

    Mobile freedom without compromise

    Odds are, someone at your business will use mobile email access; most businesses can’t work without it. Today, 93 percent of businesses have remote workers who rely on mobile technology for productivity. Don’t choose between mobile access and data security: mobile device management (MDM) enables you to manage Office 365 access across devices, and other features prevent unauthorized access.

    Protecting your business, however, means more than just safeguarding your inbox from spam and viruses. Office 365 adheres to 10 privacy compliance standards and automatically provides business users with the most up-to-date apps, as soon as they launch. To learn more about these security features, check out our eBooks, “Your Business, Secured” and “5 questions executives should be asking their security teams.”

    The post Keep your email secure with Office 365 appeared first on Office Blogs.

    MSDN Blogs: How to slowdown a SQL Server Database Backup


    Yes, you read correctly. We are not talking about accelerating a SQL Server database backup. In this customer case we faced the challenge of slowing down the full database backup of a SQL Server database underneath the customer’s SAP ERP system. How did we get into this situation? In short, the story reads:

    • We migrated a whole customer SAP landscape from AIX/DB2 to Windows and SQL Server 2014 SP1 CU4.
    • Larger systems with BW and ERP around 9TB-10TB on DB2 originally.
    • Thanks to SQL Server database compression and especially SQL Server columnstore implemented in BW, we got the databases to around 7TB for ERP and 3.7TB for BW.
    • In order to get great performance in principle and especially for the backup, we had 64 data files per database that were located on 32 distinct volumes. Backup destinations were on 4 distinct volumes where we backed up against 16 backup files.
    • Databases were all encrypted by SQL Server TDE (https://msdn.microsoft.com/en-us/library/bb934049(v=sql.120).aspx ).
    • Backups were taken on the AlwaysOn secondary replica that was supplied in a synchronous manner.
    • The backups were done without SQL Server backup compression (https://msdn.microsoft.com/en-us/library/bb964719(v=sql.120).aspx ) since backup compression is not efficient with databases that are TDE encrypted.
    • Storage bandwidth was around 2GB/sec to the storage from each of those servers. Also from the storage backend our tests showed that the I/O bandwidth could be sustained.
    • In tests we were pretty successful in pushing those bandwidth limits in performing backups.

    So far so good. However, as it all went into production, the customer realized that the great backup performance had an impact on the infrastructure, which in turn affected the production workload. So the task we faced was to slow down the IOPS that the SQL Server backup activity generated by a factor of 2, without changes to the disk configuration or to the number of backup files.

    SQL Server backup can be throttled or tuned a bit, e.g. by defining the BUFFERCOUNT parameter in the backup statement. However, the impact is not extreme enough to reach the factor of 2 fewer IOPS that we needed. You also can’t force SQL Server to buffercount=1, since SQL Server will, as a minimum, allocate 2 buffers per media stream; and since we have at least one stream per disk volume, we end up with quite a number of buffers. So something else needed to be found.
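
    For reference, this is the kind of knob we are talking about (a sketch only; database name, file paths and values are placeholders):

    -- Throttling attempt via backup tuning parameters
    BACKUP DATABASE [ERP]
    TO DISK = 'R:\Backup\ERP_01.bak',
       DISK = 'S:\Backup\ERP_02.bak'
    WITH BUFFERCOUNT = 4,           -- fewer I/O buffers in flight
         MAXTRANSFERSIZE = 1048576, -- 1 MB per transfer
         STATS = 5;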

    So we looked into SQL Server Resource Governor (https://msdn.microsoft.com/en-us/library/bb933866(v=sql.120).aspx ). Resource Governor introduced the capability to create resource pools that limit the I/O activity per volume (the MAX_IOPS_PER_VOLUME option – https://msdn.microsoft.com/en-us/library/bb895329.aspx ). It sounded like an awesome way to limit exactly the IOPS the backup would generate. The idea was to use Performance Monitor to determine the IOPS per volume and then throttle. Since we wrote against four volumes and read from 32, it certainly would be the write IOPS that we would throttle. We did not use backup compression, hence the accumulated number of reads and writes should be the same. So we did a test and saw no impact of anything being governed; the backup blasted away as before. First check: is the session we used to perform the backup even assigned to the resource group? You can find that out in the SQL Server DMV sys.dm_exec_sessions (https://msdn.microsoft.com/en-us/library/ms176013.aspx ). Check one of the last columns, called ‘group_id’. The default resource group that all client sessions go through has the value 2; for our user-defined resource group, a three-digit value greater than 255 is expected. That was the case. Nevertheless, the I/O activity of the backup execution did not get throttled, and the write IOPS were way beyond the limit we set. Why is that?

    Answer: Before SQL Server 2016, the I/O activity of the backup was not able to be limited with the MAX_IOPS_PER_VOLUME option of a resource pool. This only became possible now with SQL Server 2016. However, the customer was running SQL Server 2014. So a miss here.

    Another colleague in SQL Server development recommended to take a look into this article: https://msdn.microsoft.com/en-us/library/cc280384(v=sql.120).aspx . That is where we formed an idea:

    • So far we were not using backup compression, since it did not make any sense to use it for a TDE-encrypted database.
    • But since we were taking the backups on an AlwaysOn secondary replica, there were ample CPU resources available.
    • So let’s use compression for the SQL Server database backup and limit the CPU that compression can use, as described in that article, and thereby indirectly slow down the backup activity and the amount of I/O generated against the storage infrastructure.

    Following step by step what the article https://msdn.microsoft.com/en-us/library/cc280384(v=sql.120).aspx describes, we created the resource pool and group, created our classifier function (also based on a SQL Server login), and tried again. And it did not work. Despite setting the MAX_CPU_PERCENT option to half the CPU consumption that backup compression took in the earlier exercise, we saw no effect. Backup compression continued to take as much CPU as before, clearly far more than we had set in MAX_CPU_PERCENT for the resource pool. And the resource group/pool was being used by the session that executed the backup. Why was that?

    Answer: We did not read the article closely. Already in the first paragraph it states: ‘Therefore, you might want to create a low-priority compressed backup in a session whose CPU usage is limited by Resource Governor when CPU contention occurs‘. Looking closer at the description in this article: https://msdn.microsoft.com/en-us/library/bb895329.aspx , it states that the option we were using only comes into play when the overall CPU consumption of the server or VM is high (contention). In our case, however, we were, besides some AlwaysOn redo thread, the only consumer, and even with backup compression there were ample CPU resources left. So no CPU contention, and hence no throttling.

    Solution: There is another option available when defining a resource pool, named ‘CAP_CPU_PERCENT’. This option caps the CPU usage of tasks assigned to the resource pool at a certain percentage, independent of the overall availability of CPU resources. So we gave that a try, and it worked. Now we had a lever with which we could, indirectly though, influence the IOPS generated by the SQL Server backup. Not exactly the direct way, but a way that can work up to the point where either the infrastructure issues get resolved or SQL Server 2016 comes into use with this customer.
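
    A minimal sketch of such a setup (the pool name, percentage, and login are placeholders; the statements follow the documented Resource Governor DDL):

    USE master;
    GO
    -- Hard-cap CPU for backup sessions, regardless of contention
    CREATE RESOURCE POOL BackupPool WITH (CAP_CPU_PERCENT = 25);
    CREATE WORKLOAD GROUP BackupGroup USING BackupPool;
    GO
    CREATE FUNCTION dbo.fnBackupClassifier()
    RETURNS sysname
    WITH SCHEMABINDING
    AS
    BEGIN
        -- Route only the dedicated backup login into the capped pool
        IF SUSER_SNAME() = N'CONTOSO\svc_sqlbackup'  -- placeholder login
            RETURN N'BackupGroup';
        RETURN N'default';
    END;
    GO
    ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fnBackupClassifier);
    ALTER RESOURCE GOVERNOR RECONFIGURE;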

    Needless to say, the user context (Windows authentication) chosen to classify the sessions executing the full database backups is a user context that is used only for the full database backups and for nothing else. With that we avoid any other task using the resource pool.

    That was it: the strange task of slowing down a SQL Server backup with SQL Server 2014. But it worked.

    MSDN Blogs: CJIS Implementation: How Microsoft Government is Committed


    When it comes to the CJIS Security Policy, Microsoft is committed to providing law enforcement agencies with trusted cloud services that are uniquely equipped to help them meet or exceed their CJIS compliance requirements.

     

    The CJIS Security Policy provides a secure framework of laws, standards, and elements of published and vetted policies for accomplishing the mission across the broad spectrum of the criminal justice and noncriminal justice communities.

     

    While the CJIS Security Policy is to some extent aligned with NIST 800-53, Rev 4., there are unique CJIS Policy requirements which law enforcement agencies must adhere to. These include:

     

    Security Awareness Training:  The Policy requires that basic security awareness training be completed within six months of initial assignment, and biennially thereafter, for all personnel who have access to Criminal Justice Information (CJI), including all personnel who have unescorted access to a physically secure location.  At Microsoft, we have enhanced our approach: we require that all employees with potential access to CJI be trained at the highest security awareness training level (level 4) prior to being assigned to support CJI, and we contractually commit that the training will be done within 30 days rather than six months.

     

    CJIS Security Addendum:  The Policy requires that all private contractors who perform criminal justice functions acknowledge, by signing the CJIS Security Addendum Certification page, that they will abide by all aspects of the CJIS Security Addendum.  At Microsoft, all employees with potential access to CJI have signed the CJIS Security Addendum, as has Microsoft as a corporation, acknowledging the CJIS Security Policy and applicable regulations.

     

    Personnel Security:  The Policy requires that all personnel who have access to unencrypted CJI, including those individuals with only physical or logical access to devices that store, process or transmit unencrypted CJI, pass the minimum fingerprint-based background checks within 30 days of assignment.  At Microsoft, all employees with access to encrypted or unencrypted CJI are screened, or in the process of being screened, within 30 days of assignment in the 22 states where Microsoft has attested to meeting the applicable CJIS requirements.

     

    Formal Audits:  The Policy requires that formal audits be conducted to ensure compliance with applicable statutes, regulations and policies.  At Microsoft, the state CJIS Systems Agencies (CSAs) with an Information Agreement are permitted to access the Microsoft facilities, applicable records, and Covered Entity Data, as directly related to the Covered Services.  If required, the CSA has the right to conduct on-site audits of the covered cloud services, in accordance with the CJIS Policy, to ensure Microsoft is in compliance.

     

    In summary, when you’re thinking about CJIS and digital transformation across government priorities, you should be seeking a partner committed to CJIS compliance today and in the future. Microsoft is the innovator committed to compliance!

     

    For more information on Microsoft’s CJIS compliance you can go to this article.

    For additional implementation information, review the Microsoft CJIS Implementation Guidelines.  This document provides guidelines and resources to assist criminal justice entities in implementing and utilizing Microsoft Government Cloud features.  To stay up to date on all things Azure Government, be sure to subscribe to our RSS feed and to receive emails by clicking “Subscribe by Email!” on the Azure Government Blog.

     

    MSDN Blogs: PowerShell script to enumerate deleted items in a TFS/Visual Studio Online project


    I wanted to share the following PowerShell script (a zipped copy is also available at the end of the post) that I often use to enumerate deleted files and folders in a TFS project hosted in my Microsoft Visual Studio Online TFS instance. Notice that the script uses the tf.exe command-line utility that gets installed with Microsoft Visual Studio. Specifically, my script expects Visual Studio 2015 to be installed in the default "C:\Program Files (x86)\Microsoft Visual Studio 14.0" folder. The TFS PowerShell cmdlets that are part of the Microsoft Visual Studio Team Foundation Server 2015 Power Tools may also be used to write a similar script.

    
    $tfsUrl = Read-Host -Prompt "URL of your Visual Studio Online TFS (e.g. https://MyTfs.visualstudio.com)"
    $tfsProject = Read-Host -Prompt "TFS Project Name (e.g. MyTfsProject)"
    
    
    if($tfsUrl -eq "")
    {
      write-output "Provided TFS url is empty. Exiting script."
      return
    }
    
    if($tfsProject -eq "")
    {
      write-output "Provided project name is empty. Exiting script."
      return
    }
    elseif($tfsProject.StartsWith('$'))
    {
      write-output "Project name must not start with a dollar sign. Exiting script."
      return
    }
    
    
    $deletedItemsCommand = '& "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\tf.exe" dir ' + '$' + $tfsProject + ' /deleted /recursive /collection:' + $tfsUrl
    #write-output $deletedItemsCommand
    
    $deletedItems = Invoke-Expression $deletedItemsCommand
    #write-output $deletedItems
    
    
    $nl = [Environment]::NewLine
    
    
    # DELETED FILES
    
    $deletedFiles = @()
    
    foreach($item in $deletedItems)
    {
      if ($item -like "$/*:")
      {
        $parentFolder = $item.Substring(0, $item.Length-1)
      }
      elseif ($item -like "*;X*")
      {
        $pos = $item.IndexOf(";X")
        $itemName = $item.subString(0,$pos)
    
        # Folders
        if($itemName.StartsWith('$'))
        {
          continue
        }
    
        $itemPath = ($parentFolder + "/" + $itemName)
        $deletedFiles += $itemPath
      }
    }
    
    if($deletedFiles.Count -eq 0)
    {
      write-output "No deleted files found"
    }
    else
    {
      write-output ($nl + "**** FILES ****" + $nl)
    
      foreach($itemPath in $deletedFiles)
      {
        write-output ($itemPath)
      }
    }
    
    
    # DELETED FOLDERS
    
    $deletedFolders = @()
    
    foreach($item in $deletedItems)
    {
      if ($item -like "$/*:")
      {
        $parentFolder = $item.Substring(0, $item.Length-1)
      }
      elseif ($item -like "*;X*")
      {
        $pos = $item.IndexOf(";X")
        $itemName = $item.subString(0,$pos)
    
        # Folders
        if($itemName.StartsWith('$'))
        {
          $itemName = $itemName.subString(1)
        }
        else
        {
          continue
        }
    
        $itemPath = ($parentFolder + "/" + $itemName)
        $deletedFolders += $itemPath
      }
    }
    
    if($deletedFolders.Count -eq 0)
    {
      write-output "No deleted folders found"
    }
    else
    {
      write-output ($nl + "**** FOLDERS ****" + $nl)
    
      foreach($itemPath in $deletedFolders)
      {
        write-output ($itemPath)
      }
    }
    



    You may want to redirect the output of this script to a file by executing it with the following command from a PowerShell command line:

    .\EnumerateVsoProjectDeletedItems.ps1 > C:\Users\MyUser\Documents\DeletedItems.txt

    The script will prompt you for the URL to your TFS server or Visual Studio Online TFS instance and will also prompt for the name of your TFS project.


    DOWNLOAD: EnumerateVsoProjectDeletedItems

    MSDN Blogs: Data Platform Online – Sales Training for Microsoft Partners!



    C+E University – Data Platform Online

    Sales Training for Microsoft Partners

     

    You are invited! As a valued Microsoft partner and vital contributor to sales and support for solutions built using the Microsoft Data Platform, you are invited to attend upcoming Data Platform Online training courses occurring in September 2016.

    Data Platform Online is: a series of online, live, interactive sales training sessions in which participants learn about Data Platform solutions and sales scenarios, resources available to partners, and guidance on how to orchestrate the sale across each stage of the Data Platform sales cycle.

    Microsoft’s comprehensive Data Platform: provides tools to capture, transform, and analyze any data of any size, at any scale —using the languages and frameworks enterprises know and want in a trusted environment on-premises and in the cloud.

    Who should attend Data Platform Online: Sales professionals within Microsoft Partner organizations.

    How Data Platform Online works: There are courses on 6 distinct topics for partners to attend, each lasting 1 hour. Each course topic will be offered on one day in September, with two delivery times to accommodate global time zones. Courses are led by Microsoft experts, and participants learn through a combination of lecture, presentation and interactive discussions.

    Course topics, dates, and times:

    Register today: Investigate course details and register to attend the ones that interest you the most.

    Topic | Date/Start Times*

    Data Platform Online: Data Platform Modernization | September 7, 1am or 10am AEST
    Data Platform Online: Mission Critical Application Platform | September 8, 1am or 10am AEST
    Data Platform Online: Application Platform Migration | September 9, 1am or 10am AEST
    Data Platform Online: The new sales tools for Data Warehouse and Big Data | September 14, 1am or 10am AEST
    Data Platform Online: Modern Business Intelligence | September 15, 1am or 10am AEST
    Data Platform Online: Accelerating digital transformation through Advanced Analytics solutions | September 16, 1am & 10am AEST

     

     
