Channel: Randy Riness @ SPSCC aggregator

MSDN Blogs: The secret sauce to managing our backlog and TODO lists


We continue with our "how has the Ranger community evolved …" series by chatting about our experiences of managing backlogs and to-do lists. If you’re looking for in-depth technical information on the tools we use, you can explore Visual Studio Team Services Agile Tools.

Where we came from and where we’re heading

We started using the Microsoft Solutions Framework (MSF), which defined principles, mindsets, governance, checkpoints and an iterative approach to solution development. At the time we needed a formal process to establish ourselves, and MSF was suited to our 6-12 month roadmaps. In early 2007 we started to experiment with Scrum, applying the framework to a project that was completely off-track. In one of the daily scrums we realised that not only was the project back on-track, but the team was energised and collaborating as one. It was that moment of insight that inspired us to share our experience and food for thought in the fun and easy-to-read book .NET Enterprise Solutions … Software Engineers on their way to Pluto (aka.ms/wsbook3).


Over the years we adapted Scrum to our part-time, volunteer-based, and geographically distributed teams, defining a framework that we referred to as “Ruck”, also known as a maul or “loose scrum” in the game of rugby. It led to another book, Managing Agile Open-Source Software Projects with Microsoft Visual Studio Online (aka.ms/wsbook4), in which we shared our learnings from embracing Visual Studio Team Services, Scrum, and Kanban to manage our projects from a team to a portfolio level. As you’ll notice, continuous learning and sharing our experiences is a critical link in our DNA.

2015 was a pivotal time for the Ranger program. We switched from a rigid, well-defined, and common framework to flexible, self-managed, and self-organised teams.

Our backlog(s)

There was a definite period of chaos and disorder as we went through the process of coordinating self-organised, self-managed, and autonomous teams working together in their own way. Asking these teams for their secret sauce is like asking them for their favourite ketchup … it varies. The order that emerged is based on the following illustration, which we borrowed from our Managing Agile Open-Source Software Projects with Microsoft Visual Studio Online book.
[illustration: backlog hierarchy]

We create one Epic [1] for each product. It defines the minimum viable product, the sponsor, and the WHAT, WHEN, and WHY, and it’s owned and tracked at a program level. We create one to three [2] Features [3] per Epic, which define the features we are planning to implement in the release. We collaborate on the Features at a program and team level [4]; the product owner continues to be the sole person responsible for managing the Epics and Features on the Product Backlog, but the remaining backlog is owned and tracked by the team. This is where the commonality at a program level ends.

Some teams break down the Features into Product Backlog Items (PBIs) [5] and some break these down further into Tasks [6]. Most of the teams have embraced the Kanban Board as their preferred place to collaborate and monitor their project and its progress. It even allows team members to capture child work items on the board, as shown below, which introduces a simple and effective way for team members to maintain a TODO list for each Feature and PBI.

[screenshot: Kanban board with child work items]

The visualization offered by the Kanban board allows us to identify bottlenecks and growing or idle queues. Most importantly, it enables us to continuously improve the team’s process, reduce waste, and visually observe the impact of change.


But what’s the secret sauce?

I believe it’s the trust we place in each other!

As a Program Manager I trust that the team will self-organise and effectively self-manage their team environment, their process, and their backlogs of Features, PBIs and Tasks. In turn the team trusts the Program Manager and Product Owner to organise and manage the portfolio, defined by the backlog of Epics, guiding and keeping the team informed of business strategy changes that may impact the product. Similar to a Scrum Master, the Program Manager role has evolved into a servant-leader and enabler for the team.

Lastly, keeping it simple allows us to create a level of order and consistency, ensuring that everyone can keep focused on what’s really important … delivering value to our users.


MSDN Blogs: Experiencing Data Gaps for Availability Data Type – 11/05 – Investigating

Initial Update: Saturday, 05 November 2016 06:12 UTC

We are aware of issues within Application Insights and are actively investigating. Some customers may experience availability data missing for the tests running in the US: San Jose location. The following data types are affected: Availability.
  • Work Around: Customers can use different locations
  • Next Update: Before 11/05 09:30 UTC

We are working hard to resolve this issue and apologize for any inconvenience.
-Praveen

MSDN Blogs: The Microsoft Dynamics NAV Image in the Azure Gallery


By now, we have had a Microsoft Dynamics NAV image in the public Azure Gallery for 1-2 years. Just last week we shipped the Microsoft Dynamics NAV 2017 image, and the Microsoft Dynamics NAV 2016 Cumulative Update 13 image is now also available in the gallery.

What is it?

In essence, it is just Windows Server 2012 R2 with Microsoft Dynamics NAV and SQL Express pre-installed, but there is of course more to it than that.

The image also contains all 20 DVD images in a folder called C:\NAVDVD, plus a series of sample PowerShell scripts in a folder called C:\DEMO, which can help you set up the image for demo purposes and showcase how you can do a lot of the things that nobody wants to do manually.

Deploying the image is done like you deploy any other Virtual Machine on Azure, in the classic management portal or in the new Azure portal. Both methods end up giving you a Virtual Machine running on Azure with your very own NAV Service Tier and all clients installed, and you can immediately connect to the machine and start any of the clients on the machine.


This is of course not how you want to be demoing/using Microsoft Dynamics NAV.

You want to connect to the server using the Web Client, you want to use Tablets, Phones, Web Services, PowerBI, etc. etc.

For that, you need a few more steps.

You need to ensure that the right endpoints have been created when you deploy the machine. You also need to re-configure NAV for SSL with a certificate, open the right ports in the firewall, and change the authentication mechanism from Windows to something better suited to accessing your server from the internet.

Don’t panic – you don’t have to do all of this manually. If you remember the PowerShell scripts in the C:\DEMO folder, which I mentioned earlier – they are intended to help you do all of this. Just run the C:\DEMO\Initialize Virtual Machine.ps1 script with PowerShell, answer all the questions, and you will be up and running.

I will create a couple of blog posts describing how to create the virtual machine and make sure that everything is configured the right way. For now, let me just mention the easiest way to get up and running:

The easiest way

Go to: http://aka.ms/navdemodeploy

Login to your Azure Subscription. Select your subscription, resource group, location, a name, and a VM Admin Password. Leave the remaining fields at their defaults. Accept the terms, Pin to dashboard, and select Purchase.

Note: Name needs to be globally unique.


After 5-10 minutes of waiting, you will have deployed the latest Microsoft Dynamics NAV Demo Environment on Azure. The server should be up and running, and you can locate the URL for the landing page at:

[screenshot: DNS name of the deployed VM]

Navigate to this URL to find all the info you need on how to connect to your Microsoft Dynamics NAV 2017 Virtual Machine on Azure.

Your Virtual Machine is now accessible in a lot of different ways.

Remember to follow the instructions on the landing page on how to install the NAV Server’s self-signed certificate on the device from which you want to connect to the NAV Server; otherwise you will get certificate warnings all the time.

BTW, if you want to deploy a NAV 2016 Demo Environment, you can use http://aka.ms/nav2016demodeploy, which will take the latest CU from NAV 2016 and deploy it.

Can I use it for production?

Yes and No.

You of course cannot use SQL Express for production, but you can set up a SQL Server on Azure or use Azure SQL and then configure your NAV Service Tier to use that server instead.

You also shouldn’t use a self-signed certificate for production customers; instead, use a “real” certificate from one of the Trusted Root Certificate Authorities.

You can use the Virtual Machine, and you can use the demo scripts for configuring the server, provided you validate yourself that they do what you expect. After all, they are only sample scripts.

Does this have anything to do with Dynamics 365?

No.

This is Microsoft Dynamics NAV, the on-prem product, which can also be hosted in local hosting centers or on Azure. Microsoft is committed to an AND strategy, where we continue shipping Microsoft Dynamics NAV with everything customers and partners have loved for decades – AND – we will be shipping Dynamics 365 for Business.

Microsoft Dynamics NAV will have monthly cumulative updates (and the Azure image will be updated accordingly); Dynamics 365 for Business will follow a different update path – more like Office 365 – constantly evolving and adapting.

Enjoy

Freddy Kristiansen
Technical Evangelist

MSDN Blogs: Windows SDK archive

MSDN Blogs: Power BI – Custom Shape Map fails with non-Latin1 characters


Power BI – The Shape Map visual fails with the following error if a custom map contains non-Latin1 characters:

Failed to execute ‘btoa’ on ‘Window’: The string to be encoded contains characters outside of the Latin1 range.

After confirmation from the Power BI Desktop product team, this is a known limitation with the Shape Map that will be addressed in a future release. I will update this blog entry once the bug status changes.

Additional Details:

I hit an issue this past week while working with a customer in Amman, Jordan. We were exploring the different map visuals (Shape Map, Map, Filled Map, and ArcGIS Map) to compare/contrast the functionality of each visual and how they can use them in their reports. We encountered an issue when using the Shape Map with one of their custom maps when the custom map contained Arabic location names.

The Shape Map visual is outstanding with its built-in maps. For our project, we needed to use a custom map. To do this, you select the Shape Map, click on the Format tab, and then expand the Shape header. From here, you can click on the +Add Map button to supply your own TopoJSON map.


When we added a custom map that we had exported from Esri ArcGIS, we encountered an error.

The details in the “Send a Frown” message contained the following error:

Failed to execute ‘btoa’ on ‘Window’: The string to be encoded contains characters outside of the Latin1 range.
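The error comes from the browser’s btoa() function, which can only Base64-encode strings made entirely of Latin-1 (ISO-8859-1) characters. As a rough workaround sketch of my own (not part of Power BI), you can screen a custom map’s text for the offending characters in Python before loading it:

```python
# Sketch: list the distinct characters in a string that fall outside the
# Latin-1 range (code point > 255); these are what make btoa() fail when
# Power BI encodes a custom shape map.
def non_latin1_chars(text):
    return sorted({ch for ch in text if ord(ch) > 255})

# Hypothetical location names: one Latin-1-safe, one Arabic.
print(non_latin1_chars("Amman"))   # [] -> safe to encode
print(non_latin1_chars("عمان"))    # the Arabic characters that trigger the error
```

Running this over the string content of a TopoJSON file quickly shows which location names would need to be transliterated until the limitation is fixed.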


Thanks,
Sam Lester (MSFT)

 

MSDN Blogs: Goofing around with the Cognitive Services Translator API


Recently, I was asked to get familiar with the Translator API in Azure Cognitive Services. Not being a developer, I was a little leery of how I might approach this, but I was surprised at how easy it was to tap into this functionality. I thought I would share some of what I learned here in hopes that someone else tasked with using this or one of the other APIs in the Cognitive Services suite might find this helpful. That said, what I’ve built here is in no way intended to represent a solution to a business problem; treat it as nothing more than a technical demonstration.

Deploying the API

The Translator API is part of a massive infrastructure managed by the Bing team.  In Azure, you provision a key which grants you access to this API.  This is done by logging into the Azure Portal (https://portal.azure.com), clicking +New from the left-hand navigation, entering Cognitive Services in the resulting search, and selecting Cognitive Services APIs from the results.


Clicking the Create button takes you to a form where you enter a unique Account name and select your Subscription.  When you click API type, you’re presented with a list of the available APIs in the Cognitive Services suite. Choose Translator Text API from this list. For Pricing Tier, you have several options to choose from, but if you’re just playing around, the Free tier gives you a lot of mileage.  For Resource Group, you can choose to Create new or Use Existing and then choose a Location for the deployment.  Clicking Create deploys the API to your tenant.


Once provisioned, access the resource from within the Azure Portal. Click on the Keys item in the left-hand navigation of its default tile.  Copy one of the two keys to your clipboard for use in next steps.

Understanding the API Methods

The Translator API you’ve deployed supports a wide range of methods.  These are documented at the bottom of this page.

As you explore each method, note whether it is employed via GET or POST.  When you expand the documentation of a method, notice the Response Content Type (identified just above the Parameters list) which typically indicates the method will return data as XML. In the Parameters list, take note of the various parameters, many of which are required, and their type.  Most parameters will be submitted via query string but most methods will require an Authorization parameter in the request header. (You can ignore the required appid parameter if you are submitting an Authorization parameter as shown below.)


Calling the Translate Method using PowerShell

At this point, I suspect the mechanics of the method call are a bit fuzzy.  Hopefully, jumping into some code will solve this problem.

To do this, let’s use PowerShell.  Why PowerShell?  I can’t imagine too many scenarios where PowerShell is the right solution for making calls to the Cognitive APIs, but I like the interactive nature of the PowerShell environment.  Instead of throwing all the demo code at you at once, I’ll show it off in waves and PowerShell will allow you a bit of flexibility to play with it before moving on to the next step.

Launch PowerShell and enter the following code:

$accountKey = "<insert the key you copied to your clipboard here>"
$tokenServiceURL = "https://api.cognitive.microsoft.com/sts/v1.0/issueToken"
$query = "?Subscription-Key=$accountKey"
$uri = $tokenServiceUrl+$query


$token = Invoke-RestMethod -Uri $uri -Method Post
$token

Running this, you should see a long token value returned from the API.


With this token value, you can now assemble the header for your requests:

$auth = "Bearer "+$token
$header = @{Authorization = $auth}
$header


NOTE: If you receive an error that your token has expired, be sure to rerun the previous code to retrieve a new token and assemble a new header. The token expires every 10 minutes.

To translate text, the Translate method requires the text you want to translate as well as a code that indicates the language you want to translate it to. Similarly, you can tell the method the original language of the submitted text, but this is optional, as you can opt to have the API determine the original language for you. There is a method to return the list of language codes, but I find it’s easier to just look them up here.

$fromLang = "en" #English
$toLang = "es"   #Spanish
$text = Read-Host "Enter text to translate"


Using this information, you now can call the method.  The text submitted needs to be encoded and the method needs to be called with the GET verb:

Add-Type -AssemblyName System.Web   # loads System.Web.HttpUtility for UrlEncode

$translationURL = "http://api.microsofttranslator.com/v2/Http.svc/Translate"
$query = "?text=" + [System.Web.HttpUtility]::UrlEncode($text)
$query += "&from=" + $fromLang
$query += "&to=" + $toLang
$query += "&contentType=text/plain"
$uri = $translationUrl+$query

$ret = Invoke-RestMethod -Uri $uri -Method Get -Headers $header

The $ret variable contains the XML output from the API service. PowerShell handles XML using the .NET System.Xml.XmlDocument class, which exposes a bunch of properties and methods for extracting information from the XML document. If you’d like to see the XML in its raw form, you can make use of the OuterXml property on the variable:

$ret.OuterXml


You can see from this that the XML contains a single node named string.  To access the value assigned to this node, i.e. the translated text, you can reference it using:

$ret.string.'#text'


Conclusion

As I said at the top, this isn’t supposed to represent a solution but instead provide a way to explore this API.  I hope this has been useful to those of you interested in working with Cognitive Services.

 

MSDN Blogs: Reporting Services Load Testing


 

A common ask from consultants and architects working with SQL Server Reporting Services is about capacity planning and how to validate that the environment proposed will meet the estimated workload. It is a simple question with a very complex answer, because there are plenty of variables in an enterprise SQL Server Reporting Services deployment.

One of the tools we use in the development team is synthetic load generation using Visual Studio Load Tests. Simulating every detail of an interaction with the server is a daunting task, so we use synthetic workloads, where we favor the flexibility of defining different settings in the workload over an accurate 1:1 simulation of what will happen in the browser and in the native clients. The load generation is an abstraction that allows us to create workloads on the server in an agile way.

Nothing stops you from creating your own set of load tests using the different tools that Visual Studio and others offer. However, you might find it challenging due to the complexity of the different APIs and security features you will need to use in order to drive the workload.

Ladies and gentlemen, don’t hold your breath: let me introduce you to the Reporting Services Load Test project hosted on GitHub. We welcome you to try it out and contribute to it. This project takes care of many of the challenges, although it is not for the faint of heart, as it has its own set of complexities. We have done our best to document how to use it, all written with the world-acclaimed style of engineer prose in the project readme.

The project contains two sample workloads that we use in our development cycle: one called PaginatedLoad, which has a mix of paginated reports only, and one called MixedLoad, which has a combination of paginated, mobile, and portal tests. You can also use the tutorials How to onboard a new Paginated Reports Scenario and How to onboard a new Mobile Reports Scenario to create your own workloads with your own reports.

In case you need to validate your topology but you don’t have any hardware, the project contains an Azure ARM template that we use in our development cycle to bootstrap a minimal enterprise environment in Azure with the following machines:

  • Domain Controller
  • SQL Server Engine to host the Reporting Services Catalog
  • SQL Server Engine and AS Tabular to host Data Sources
  • SQL Server installation with public DNS to configure Reporting Services

This project is very close to my heart, as it is something that we have had in the team for a long time but never had the chance to share with the community. It was also a challenging project, as the code required a “little bit” of cleanup and changes to make it simpler (it’s still complex, and it has its own personality and quirks, as any good old seasoned project does).

You might be thinking: well, all that is good and fancy, but what can I do with it besides overwhelming my dear Reporting Services environment with requests?

One of the scenarios is to validate that the environment you are setting up will be able to support your estimated demand. Remember that the workload is synthetic, so it is not a 1:1 mapping between users and test cases but an approximation. Basically, you can take your existing reports and the usage pattern you expect (from exports and rendering) and create a Load Test with them (see the tutorials and the existing load test samples in the solution).

Then you can take the Load Test, run it in your environment, and monitor your CPU, memory, and Windows performance counters to figure out what your bottlenecks are. You can combine that with the metrics that Visual Studio Load Test will collect for every report and every scenario, like:

  • Passed Tests / Sec
  • Failed Test / Sec
  • Avg. Response Time

Among many others. You can use these to figure out when your environment is not able to support the load consistently; for example, the image below shows a view of one of the suite’s reports. The tests/sec and the passed tests increase nicely along with the user load until a bottleneck is reached, and then the avg. response time increases and the number of tests/sec drops dramatically.

 

[chart: tests/sec, passed tests, and avg. response time vs. user load]

This is just one example of the validation you can do with the suite. You can experiment with different workloads, step patterns, and combinations of tests (actions in the portal), and after you set one up, you only have to change the config file to point it to a different server and run the same workload again (either from Visual Studio Load Test on premises or in the cloud with Visual Studio Team Services).
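As a rough sketch of that “find the knee” analysis (hypothetical numbers, not output of the suite), you could detect the load step where throughput stops scaling with user load like this:

```python
# Sketch: given avg tests/sec measured at each user-load step, return the
# first user load where throughput stops growing meaningfully - a crude
# signal that a bottleneck has been reached.
def saturation_step(user_loads, tests_per_sec, min_gain=0.05):
    for i in range(1, len(tests_per_sec)):
        prev, cur = tests_per_sec[i - 1], tests_per_sec[i]
        if cur < prev * (1 + min_gain):   # throughput flat or dropping
            return user_loads[i]
    return None                           # no bottleneck observed

# Hypothetical run: throughput scales until ~200 users, then collapses.
loads = [25, 50, 100, 200, 400]
tps   = [10.0, 19.5, 38.0, 74.0, 60.0]
print(saturation_step(loads, tps))  # 400
```

The 5% threshold is an arbitrary choice for illustration; in practice you would also cross-check the avg. response time counter at the same step.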

 

This posting is provided “AS IS” with no warranties, and confers no rights

MSDN Blogs: Connecting Azure Data Lake Analytics/Storage with Azure HDInsight Clusters


Recently, security enhancements to Azure HDInsight were announced that secure access to Azure Data Lake Store (ADLS) and Azure Data Lake Analytics (ADLA). This option allows data access and processing to flow from Azure Data Lake Analytics jobs into Hadoop batch processing.

In this Team Data Science Process sample, we’ve analyzed around 20 GB of NYC taxi trip data using an Azure Data Lake Analytics job, processed it with ADLS (Data Lake Store), and managed the output through an HDInsight Hive activity before building a multiclass classification algorithm using Azure Machine Learning.

More details on provisioning secure Azure HDInsight cluster with Azure Data Lake can be found in this blog.

The ADLA job after processing looks like this.

[screenshot: ADLA job graph]

The output of the job is stored in Azure Data Lake Store, which is further processed by a Hive activity in HDInsight. While provisioning the Azure HDInsight cluster, we need to make sure to enable ‘Cluster AAD identity’ using the ADLS account details.

[screenshot: Cluster AAD identity settings]

 

When configuring ‘Cluster AAD identity’, you need to provide ADLS access details while provisioning the HDInsight cluster, as in the following screenshot, along with the AD service principal details.

 

[screenshot: HDInsight cluster provisioning with ADLS access details]

 

Once the external Hive table is created with the underlying data from Azure Data Lake Storage on the NYC taxi trip & fare dataset, we’ve created Azure ML models with classification, multiclass classification & regression algorithms.

 


 

 

The final Azure ML dashboard for the NYC taxi trip tip calculation looks like this:

[screenshot: Azure ML dashboard]

 

 


MSDN Blogs: What is a DAX Expression?


Basic DAX Syntax

A DAX formula is composed of an equal sign followed by a function or expression.

  • Functions perform operations such as concatenating or adding values, calculating sums or averages, or performing logical tests. Functions usually take some kind of argument, which might be a reference to a column or table. Functions can be nested inside other functions.
  • An expression can be used to define a value that can be a literal value or constant, a Boolean test, or a reference to a column containing values. Boolean expressions can be used to define a filter condition, such as [Sales] > 100.
  • Operators within expressions, such as a plus or minus sign, indicate how the values are to be compared or processed.
  • Values that you use in formulas and expressions can be typed directly into the formula bar as part of an expression, or they can be obtained from other columns, tables, or formulas. However, you cannot reference only a few cells or a range of cells; DAX always works with complete columns or tables.

For example, the following formulas are all valid:

  • =3
  • =”Sales”
  • =’All Sales'[Amount]
  • =[Amount]*1.10
  • =PI()
  • =’FALSE’ = 0
  • =SUMX(FILTER(Sales,Region=”Europe”),[SalesAmount])

Summary:

  • The formula is the full line.
  • A function example is “SUMX”.
  • An expression can be used to define values.
  • An operator is just the sign (=, <, etc.), and a value is just the number/value being defined within the expression or used by the function.
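To make the last formula in the examples above concrete, here is a rough Python rendering (an illustration, not DAX) of how =SUMX(FILTER(Sales,Region=”Europe”),[SalesAmount]) evaluates. FILTER keeps whole rows, and SUMX sums an expression over them; note that everything operates on complete columns and tables, never on individual cells:

```python
# A tiny stand-in for the Sales table (hypothetical data).
sales = [
    {"Region": "Europe", "SalesAmount": 100.0},
    {"Region": "Asia",   "SalesAmount": 250.0},
    {"Region": "Europe", "SalesAmount": 50.0},
]

# FILTER(Sales, Region = "Europe"): keep the rows where the Boolean
# expression is true.
europe_rows = [row for row in sales if row["Region"] == "Europe"]

# SUMX(<filtered table>, [SalesAmount]): iterate the rows, summing the
# [SalesAmount] expression evaluated for each one.
total = sum(row["SalesAmount"] for row in europe_rows)
print(total)  # 150.0
```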

Leave comments if you want to help clarify this, and I’ll update this article accordingly.

Thanks!

Wherever you go… BOOM! You’re there.

– Ninja Ed

MSDN Blogs: SQL Server Physical Joins


SQL Server Physical Join (Nested Loops joins, Merge joins, Hash joins)


Advanced Query Tuning Concepts
https://technet.microsoft.com/en-us/library/ms191426(v=sql.105).aspx

SQL Server employs three types of join operations:

  • Nested loops joins
  • Merge joins
  • Hash joins

 

Nested Loops Joins

If one join input is small (fewer than 10 rows) and the other join input is fairly large and indexed on its join columns, an index nested loops join is the fastest join operation because they require the least I/O and the fewest comparisons. For more information about nested loops, see Understanding Nested Loops Joins.

Nested Loops Join
https://blogs.msdn.microsoft.com/craigfr/2006/07/26/nested-loops-join/
OPTIMIZED Nested Loops Joins
https://blogs.msdn.microsoft.com/craigfr/2009/03/18/optimized-nested-loops-joins/
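As a minimal sketch (an illustration of the algorithm, not SQL Server’s actual operator), a nested loops join scans the inner input once per outer row; with an index on the inner join column, that inner scan becomes a cheap lookup:

```python
# Nested loops join: for each row of the small outer input, scan the inner
# input for matching join keys. Hypothetical (key, value) rows.
outer = [(1, "a"), (2, "b")]                     # small outer input
inner = [(1, "x"), (2, "y"), (2, "z"), (3, "w")] # inner input

result = []
for o_key, o_val in outer:
    for i_key, i_val in inner:   # in practice, an index seek on i_key
        if o_key == i_key:
            result.append((o_key, o_val, i_val))
print(result)  # [(1, 'a', 'x'), (2, 'b', 'y'), (2, 'b', 'z')]
```

This is why the operator shines when the outer input is tiny: the cost is roughly outer rows × (cost of one inner probe).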

Merge Joins

If the two join inputs are not small but are sorted on their join column (for example, if they were obtained by scanning sorted indexes), a merge join is the fastest join operation. If both join inputs are large and the two inputs are of similar sizes, a merge join with prior sorting and a hash join offer similar performance. However, hash join operations are often much faster if the two input sizes differ significantly from each other. For more information, see Understanding Merge Joins.

Merge Join
https://blogs.msdn.microsoft.com/craigfr/2006/08/03/merge-join/
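A minimal sketch of the idea (illustrative only, assuming a one-to-many join with unique keys on the left input): because both inputs are already sorted on the join column, a single coordinated scan of each is enough:

```python
# Merge join over two inputs pre-sorted on the join key.
left  = [(1, "a"), (3, "b"), (5, "c")]           # sorted, keys unique
right = [(1, "x"), (3, "y"), (3, "z"), (6, "w")] # sorted

result, i, j = [], 0, 0
while i < len(left) and j < len(right):
    if left[i][0] < right[j][0]:
        i += 1                   # left key too small: advance left
    elif left[i][0] > right[j][0]:
        j += 1                   # right key too small: advance right
    else:
        result.append((left[i][0], left[i][1], right[j][1]))
        j += 1                   # advance the "many" side on a match
print(result)  # [(1, 'a', 'x'), (3, 'b', 'y'), (3, 'b', 'z')]
```

Each input is read exactly once, which is why merge join wins when both inputs are large and already sorted.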

Hash Joins

Hash joins can efficiently process large, unsorted, nonindexed inputs. They are useful for intermediate results in complex queries because:

  • Intermediate results are not indexed (unless explicitly saved to disk and then indexed) and often are not suitably sorted for the next operation in the query plan.
  • Query optimizers estimate only intermediate result sizes. Because estimates can be very inaccurate for complex queries, algorithms to process intermediate results not only must be efficient, but also must degrade gracefully if an intermediate result turns out to be much larger than anticipated.

The hash join allows reductions in the use of denormalization. Denormalization is typically used to achieve better performance by reducing join operations, in spite of the dangers of redundancy, such as inconsistent updates. Hash joins reduce the need to denormalize. Hash joins allow vertical partitioning (representing groups of columns from a single table in separate files or indexes) to become a viable option for physical database design. For more information, see Understanding Hash Joins.
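A minimal sketch of the idea (illustrative only, not SQL Server’s full grace/recursive hash join): build a hash table on the smaller input, then stream the larger input through it. Neither input needs to be sorted or indexed, and each is read only once:

```python
# Hash join: build a hash table on the smaller (build) input, then probe it
# with each row of the larger (probe) input. Hypothetical (key, value) rows.
build = [(1, "a"), (2, "b")]
probe = [(2, "x"), (1, "y"), (3, "z"), (2, "w")]

table = {}
for key, val in build:                 # build phase
    table.setdefault(key, []).append(val)

result = []
for key, val in probe:                 # probe phase
    for b_val in table.get(key, []):
        result.append((key, b_val, val))
print(result)  # [(2, 'b', 'x'), (1, 'a', 'y'), (2, 'b', 'w')]
```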

Hash Join

MSDN Blogs: Countdown to Connect(); 2016

MSDN Blogs: Try our Azure SQL Data Warehouse service free for 1 month


Azure SQL Data Warehouse

You can set up Azure SQL Data Warehouse in minutes and use it free of charge with up to 2 TB of your data, at up to the 200 DWU (data warehouse unit) level. To do so, simply apply by filling in the form at https://azure.microsoft.com/tr-tr/services/sql-data-warehouse/extended-trial/.

I’d like to note that frequently asked questions about this free trial can also be found at the bottom of the form.

 

MSDN Blogs: Power BI reports are coming to SQL Server Reporting Services


Power BI in SSRS

About a year ago, our SQL Server product group published our reporting roadmap in the Business Intelligence Reporting Roadmap. That roadmap stated that Power BI reports would also be usable in the on-prem version of SQL Server, and a demo of this was shown at Microsoft Ignite 2016.

You can quickly set up and test our technical preview from the Azure Marketplace at https://portal.azure.com/#create/reportingservices.technical-previewpreview (remember that you need an Azure account; if you don’t have one, you can get a trial account at http://www.azure.com and test with that).

 

MSDN Blogs: DIY Adjustable Laptop VESA stand with docking station mount


[This quick DIY steps through the process to build a VESA-mounted docking station for a regular laptop.]

 

I got fed up with my HP ProBook 640 G2 laptop sitting in a docking station on my desktop taking up valuable space. My big desktop monitors are raised above the desk surface so why not the laptop? I found all sorts of laptop stands on the market which raise the laptop up, but none are designed with docking in mind. I wanted to be able to snap my laptop into the stand with the screen at eye-level without having to fumble with a bunch of cables and without affecting the docking station warranty.

So I built my own:

[photo: finished laptop VESA stand]

 

Materials:

  • Generic VESA monitor arm (~$20)
  • HP 2013 UltraSlim Docking Station (~$130)
  • 2 1/4″ acrylic sheets (~120mm x ~400mm) (~$15)
  • fasteners ($4)
    • 6 M4 16mm bolts
    • 6 M4 washers
    • 6 M4 nuts
    • 2 M3 16mm bolts
    • 2 M3 washers
    • 2 M3 nuts

Tools:

  • screw driver
  • pliers
  • heat gun
  • drill
  • Dremel rotary tool (optional)
  • clamps and wooden blocks (optional)

 

Steps:

 

Upon examining the bottom of the HP UltraSlim dock, I found two hanger areas that can be used to securely fasten the dock to the stand. Nails? Velcro? Nope. I used M3 screws, washers, and wing nuts since that’s what I had in my parts bin.


I had a scrap piece of acrylic which I bent with a little heat gun that my wife uses for embossing. I had never bent acrylic at home before, but found several guides on the Interwebs. 1/4″ acrylic is thick enough that I needed to alternate heating both sides before starting the bend. Took less than 5 minutes to get a  perfect 90° bend!


I used the VESA plate from the generic monitor arm as a template and drilled 4 holes. Here I’m using the 50mm x 50mm VESA template because my scrap piece of plastic was small. I recommend using the 100mm x 100mm template instead. I fastened the VESA plate to the plastic with 4 M4 nuts/bolts.


The second piece of plastic doesn’t need any bends. It just needs carefully drilled holes to match up to the hanger locations on the HP Docking station. I used M3 nuts/bolts here.


After test-fitting, I drilled additional holes to allow fasteners to hold the two plastic pieces together.


Looking back, it would have been cleaner to use a single piece of plastic shaped like a “T” or a “+” rather than fasten two pieces together. But I didn’t want to make a trip to the store and these are the bits I had in my scrap parts bin.

img_0668img_0668a

Attach the VESA plate to your VESA arm and test fit the laptop. On my first try I didn’t leave enough room to fit the USB cables behind the dock. So I had to drill a few extra holes to position the laptop further away from the arm.

img_0669img_0671

I got it right the second time. 1/4″ acrylic is surprisingly durable, but for those who are wary of the plastic snapping with prolonged use, it would be simple to make this out of metal or wood, or to add some supports to strengthen the base.

img_0672img_0670

 

I used an HP laptop and dock, but this ought to work for any enterprise-series laptop from Lenovo, Toshiba or Dell that has a snap-in docking connection.

 

Back to the main blog

MSDN Blogs: A live show, then events. And on to de:code


I went to see a live show for the first time in years.

I had somehow been avoiding live shows, but she had recently become my favorite artist, so I went. It was wonderful. Completely different from listening through headphones. Of course I knew it would be, but still. And when I listened to her best-of album the next day, it felt different: it came through carrying all sorts of emotions. And I felt truly sorry that this was her last live performance.

eir

Being live

With music, of course, the acoustics, the set list and the arrangements all differ. It is not the CD. But that too is part of the appeal. You pack in what can only be conveyed in that space, on that day. You convey the real thing. That cannot be produced by selling single tracks, nor by a live DVD. There are things you can only receive, and only convey, because you shared that space.

And that is what moves people.

And I remembered that we are the same. Our content is technology. To convey the technology I genuinely want to share, we run seminars and events rather than concerts. These days you can live-stream, and if a session is recorded you can watch it on demand. But I keep coming back to the thought that what you can convey by being there in person, on the day, is different.

Tech Summit's best evaluations yet

At the recent Tech Summit, as always, we received evaluations from the people who attended.

img_4522

The overall event, the keynote, the breakout sessions: every event is rated against the same evaluation criteria, used at Microsoft events worldwide. And when the results came in, Tech Summit had earned outstanding scores across the board, for the event as a whole, the keynote and the breakout sessions alike.

I suspect we could hardly have earned scores like these by live-streaming each session or running them as small seminars. (Of course, people who get good scores will get them anywhere.) Preparing content condensed for that one day, and conveying the technology we want to share as if in conversation with the attendees in that space: I believe that carries real power.

It is not that we cut corners on these things in our day-to-day work. But I think there is a kind of "atmosphere" that can only be conveyed in that venue.

de:code 2017

On May 24–25, 2017, we will hold de:code, our event for developers and IT engineers.

ky_decode-01

The information covered in the sessions may well be made public soon afterward. But if, as an engineer, there is something you feel about new technology, something you are hoping for, then we would love to have you join us.

And it is not just us: the partner companies taking part also have plenty of ideas, technologies and products they want to convey. Beyond simply attending, if you have opinions on what we could do to convey these to customers even better, please do share them. We, of course, rethink this every time as well.

Looking ahead

Well, maybe the event just ended, I went to a live show and enjoyed myself, and I am simply a little fired up. But these are the things I found myself thinking about again. Perhaps none of this is news.

Going back to my own beginnings, I want to build a de:code that surpasses Tech Summit.


MSDN Blogs: Connecting to NAV Web Services from PHP- take 2


Back in the day (January 19th, 2010, to be exact) I wrote a blog post on how to connect to NAV Web Services from PHP here.

Web Services was fairly new and there wasn’t a lot of info out there on how to do things.

NTLM and SPNEGO

The biggest challenge was authentication. NAV 2009 used Windows authentication with SPNEGO, and in 2009SP1 we added a special key to the CustomSettings config file called WebServicesUseNTLMAuthentication, which you could set to true to allow products like PHP to connect to NAV Web Services.

PHP didn’t have native support for NTLM, but the post above describes how to use a custom stream to allow PHP to work.

Today

Since then, NAV has added different authentication providers, and today I wouldn’t recommend using Windows Authentication if you are setting up Web Services for external access. Even if you are using NAV on-prem with Windows Authentication, it is fairly easy to set up an additional Service Tier with NavUserPassword, which can be used for Web Services and allows access through firewalls.

Setting up a Service Tier with NavUserPassword allows Web Services to be accessed using Basic authentication, which is natively supported by virtually all programming languages. With Basic authentication, the username and password are sent over the wire, so if you use this for anything but demo purposes you will of course want an SSL certificate to secure the communication.

A lot of people have asked me about connecting to NAV Web Services from PHP since then, and my response has always been that it is straightforward: just remove all the code that has to do with NTLM authentication from the old blog post and you should be fine.

Today I decided to prove myself right. :-)

Setting up a test NAV server

This blog post describes how to setup a demo environment using the NAV Image in the Azure Gallery. This is by far the easiest way to get started and I will use this as our starting point.

NAV is set up for NavUserPassword authentication, firewalls are open, endpoints have been added and a self-signed certificate is in place to secure the communication.

Installing PHP

For testing purposes, I will install PHP on the same server. I will still communicate with the public DNS name of the virtual machine instead of localhost, to ensure that communication flows as it would if you were NOT on the same box.

PHP is available in the Web Platform Installer. Select the right version and install.

web-platform-installer

The installer will install PHP, including prerequisites, and configure it with IIS. The machine might need a reboot.

Visual Studio Code

As my editor, I will use Visual Studio Code. It is lightweight and better than Notepad when it comes to PHP code editing.

After installing Visual Studio Code, I added the PHP Code Format extension:

vscode

Hello World

In order to check that PHP was installed and configured correctly, I created a small file called helloworld.php in the folder C:\inetpub\wwwroot\http. This folder contains the landing page and is accessible both from within the VM and from the outside using a browser.

My hello world code looks like this:

<html>
<head><title>PHP Hello World</title></head>
<body>
<?php echo "<p>Hello World</p>"; ?>
</body>
</html>

and connecting to the web site reveals that things are up and running:

helloworld

Connecting to NAV…

The next thing I did was to take the code from the “old” blog post and strip out all the NTLM stuff, using only the standard SoapClient available in PHP, and it was exactly as easy as I always thought it would be (well… sort of :-)).

<html>
<head><title>Connecting to NAV Web Services from PHP</title></head>
<body>
<?php
$login = "<username>";
$password = "<password>";
$baseUrl = "<baseUrl>"; // ex. https://<servername>.cloudapp.net:7047/NAV/WS/

$context = stream_context_create(['ssl' => [
    // set some SSL/TLS specific options
    'verify_peer' => false,
    'verify_peer_name' => false,
    'allow_self_signed' => true
]]);

$options = array(
    "login" => $login,
    "password" => $password,
    "features" => SOAP_SINGLE_ELEMENT_ARRAYS,
    "stream_context" => $context
);

$client = new SoapClient($baseUrl."SystemService", $options);

// Find the first Company in the Companies
$result = $client->Companies();
$companies = $result->return_value;
echo "Companies:<br>";
foreach ($companies as $company) {
    echo $company."<br>";
}
$cur = $companies[0];

$pageURL = $baseUrl.rawurlencode($cur)."/Page/Customer";
echo "<br>URL of Customer page: $pageURL<br><br>";
$page = new SoapClient($pageURL, $options);

$params = array("No" => "10000");
$result = $page->Read($params);
$customer = $result->Customer;
echo "Name of Customer 10000: ".$customer->Name."<br><br>";

$params = array(
    "filter" => array(
        array("Field" => "Location_Code", "Criteria" => "RED|BLUE"),
        array("Field" => "Country_Region_Code", "Criteria" => "GB")
    ),
    "setSize" => 0
);
$result = $page->ReadMultiple($params);
$customers = $result->ReadMultiple_Result->Customer;
echo "Customers in GB served by RED or BLUE warehouse:<br>";
foreach ($customers as $cust) {
    echo $cust->Name."<br>";
}
?>
</body>
</html>

I did actually change two things in the sample from 2010:

  1. The stream_context for ignoring self-signed certificate warnings (or errors)
  2. The SOAP_SINGLE_ELEMENT_ARRAYS feature, which ensures that SOAP returns an array even when only one element is returned (hat tip Dirk)
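Pulled out of the full page, the two additions look like this. A minimal sketch: the placeholder credentials are stand-ins, and disabling certificate verification is only acceptable for a demo setup with a self-signed certificate.

```php
<?php
// The two changes compared to the 2010 sample, in isolation.
// <username> and <password> are placeholders, as in the full sample above.

// 1. A stream context that accepts the self-signed demo certificate.
//    Demo only; with a real certificate, leave the SSL defaults alone.
$context = stream_context_create([
    'ssl' => [
        'verify_peer'       => false,
        'verify_peer_name'  => false,
        'allow_self_signed' => true
    ]
]);

// 2. SOAP_SINGLE_ELEMENT_ARRAYS, so a result containing a single element
//    still comes back as an array and the foreach loops work unchanged.
$options = [
    'login'          => '<username>',
    'password'       => '<password>',
    'features'       => SOAP_SINGLE_ELEMENT_ARRAYS,
    'stream_context' => $context
];
```

Both values are then passed as the second argument to every new SoapClient(...) call.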

The result

Opening the web site in a browser gave me the result I was looking for:

result

This example actually runs on a NAV 2016 CU13 virtual machine on Azure and connects to NAV on a NAV 2017 virtual machine, but was also tested on NAV 2016.

OData and other stuff

I haven’t focused on how to consume OData feeds and other things from PHP. This is more of a PHP challenge than a NAV challenge: NAV today uses standard authentication, and set up with the right certificates and configuration, connecting to NAV OData Services should be straightforward.
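As a rough illustration, an OData request from PHP could look like the sketch below. The server name, port, company and service names are made-up placeholders (the URL shape follows the usual NAV-era Company('Name')/Service pattern); check your own Service Tier configuration for actual values.

```php
<?php
// Hypothetical sketch: fetch a NAV OData feed with Basic authentication.
// Host, port, instance, company and service names are placeholders.

function navODataUrl($base, $company, $service) {
    // Company names often contain spaces, so they must be URL-encoded,
    // just like rawurlencode() is used in the SOAP sample above.
    return $base . "Company('" . rawurlencode($company) . "')/" . $service;
}

$url = navODataUrl(
    "https://myserver.cloudapp.net:7048/NAV/OData/",
    "CRONUS International Ltd.",
    "Customer"
);

// Basic authentication is just an HTTP header, so no SOAP machinery is
// needed; file_get_contents() with a stream context is enough.
$context = stream_context_create([
    'http' => [
        'header' => 'Authorization: Basic ' .
                    base64_encode('<username>:<password>')
    ],
    'ssl' => [
        // Demo only: accept the self-signed certificate.
        'verify_peer'       => false,
        'verify_peer_name'  => false,
        'allow_self_signed' => true
    ]
]);

// Uncomment against a live server:
// $json = file_get_contents($url . '?$format=json', false, $context);
```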

 

Enjoy

Freddy Kristiansen
Technical Evangelist

MSDN Blogs: The November 2016 update for Lync 2013 (Skype for Business) has been released


Hello, this is the Japan Lync/Skype support team.
The November 2016 update for Lync 2013 (Skype for Business) has been released.

November 1, 2016, update for Lync 2013 (Skype for Business) (KB3127934)
https://support.microsoft.com/en-us/kb/3127934

For details of the update, please see the technical documentation below.



Disclaimer:
The contents of this information (including attachments and linked material) are current as of the date of writing and are subject to change without notice.

 

MSDN Blogs: MEC Monday: Curated STEM Resources


Over the last few weeks we’ve been taking a look each Monday at a different course, resource or activity from the Microsoft Educator Community. There is something for everyone on the MEC, no matter what subject or age group you teach, with courses and activities varying in duration, so regardless of how much time you have to commit, you’ll find something to do to further your own CPD or to engage your students.

So far we have covered:

This week we’re turning our focus to a collection of materials within the MEC, all related to STEM.

Microsoft Educator Community: Curated STEM resources

STEM (Science, Technology, Engineering and Mathematics) is more than a buzzword; it is a paradigm shift in education. STEM places the student at the centre of learning, enabling them to question, interact with and build the world they see as well as the world they want to see.

With so many options, it can be difficult to know where to start or what’s new. This carefully curated STEM resource page is updated regularly to help educators at all levels, from those who are just starting out to those wanting to delve deeper into adopting a STEM curriculum.

mec-stem

Head over to the Microsoft Educator Community to browse all of the content contained within this collection of specially curated STEM resources. To give you an idea of what you’ll find, here’s an extract from one of the sections of the MEC page:

10 Free Ways to STEM the Gap

Would you like to learn about ways other teachers and industry leaders are making connections using STEM education across the curriculum? Join Microsoft Fellow Todd Beard as he discusses how we can work together to “STEM the Gap” with hands-on tools and activities.

Todd will cover why it is so important to immerse our students in the integration of Science, Technology, Engineering, and Mathematics. He will also explain how to integrate STEM tools in the classroom so that students can create and collaborate with others. In addition, Todd will share some great, free STEM tools and resources available to students, teachers, and parents, and show how the Microsoft Education Community can help. Come discover all of this and more!


If you’ve not yet signed up for the MEC, then it is free and easy to do so. Just head to education.microsoft.com and click on the ‘JOIN NOW’ button in the top right hand corner to get started.

MS Access Blog: First free online OneNote conference aims to engage a global audience


Join the first-ever online Learn OneNote Conference, where for six days experts from across the globe will showcase their OneNote tips, tricks and best practices. During the conference, 21 speakers will share—via video—how they use OneNote at home, at work and in education. These pre-recorded videos will be released at scheduled times throughout the conference. Register now so you can view the videos at your convenience for free for the duration of the conference. The conference begins November 12 and concludes on November 17.

Over 20 OneNote expert speakers are lined up

Organized by OneNote super-fan Jared DeCamp, the Learn OneNote Conference features 21 expert speakers from all over the world. Speakers will provide in-depth videos on OneNote topics, ranging from how a deputy district attorney uses OneNote as a replacement for a three-ring binder and how a small business owner tracks expenses with OneNote, to how a mother makes her life less chaotic by planning her house remodel in OneNote.

The last two days will feature educators sharing how they plan their lessons, how they maximize OneNote Class Notebooks, and how they give students effective feedback on assignments with audio and text. Each of the six days will include three to four sessions focused on a specific OneNote theme.

The daily themes are:

  • Day 1: Getting started with OneNote
  • Day 2: Powerful OneNote features
  • Day 3: OneNote for life
  • Day 4: OneNote for business
  • Day 5: OneNote for education
  • Day 6: OneNote for education (continued)

See this Sway for speaker information and schedule details:

Submit your own tip video in your own language

We know there are thousands of OneNote Ninjas across the globe who are all using the product in their own way. We invite you to create a five to 15-minute video in which you give us your best OneNote tips. The important part of this initiative is: you may record this video in your own language, making this a truly international event.

Videos have already been submitted in Cantonese, Armenian and several in English; but we need more languages represented. ¿Cómo se utiliza OneNote?

View this Sway for details on how to create and submit your video so that all conference participants can watch it:

Showcase video on embedded objects

On day two, the online event will be all about the most popular OneNote features, such as page templates, tags, view options and workflow. Embedding live objects in OneNote pages has become another much-loved feature since its launch in November 2015. On OneNote Central (@OneNoteC), Marjolein Hoekstra keeps a close eye on its developments and frequently announces new discoveries.

Today, we’re excited to release her video about embedded live objects ahead of time, so that you can catch a glimpse of what the conference will be like. Have you ever wanted to insert a video playlist into your OneNote page? Maybe a Sway or a PowerPoint presentation? Marjolein also points out brand-new content types that have not officially been announced yet.

Get a copy of Marjolein’s Embedding Live Content resource notebook hosted on Docs.com. It contains all the examples discussed on Marjolein’s video and several resources related to the same topic. If you want to see more videos like Marjolein’s, be sure to sign up for free at the Learn OneNote Conference site today.

Put your pin on the map

We believe OneNote plays a significant role in Microsoft’s global mission of empowering every person on the planet to achieve more. Attending this conference does not cost any money, require any travel or force you to take a big risk. So, how about taking action and becoming a part of this unique event?

Whether you are joining as a speaker, contributor or participant, feel free to push your pin on the global map:

How do I register?

To register, visit the Learn OneNote Conference site and click Register for Conference. Be sure to confirm your email address to receive links to all the conference videos and resources.

Microsoft in Education OneNote events

Did you know Microsoft in Education organizes monthly #OneNoteQ TweetMeet events? Every first Tuesday of the month at 10 a.m. and 4 p.m. PT, OneNote experts and Microsoft Innovative Educators are ready to discuss a specific topic with you. To join, follow #OneNoteQ in your Twitter client. If you use TweetDeck, simply click the looking glass icon on the top left, enter #OneNoteQ as your search query and add the column. The column will automatically refresh when new tweets tagged with #OneNoteQ arrive.

—The OneNote team

About Jared DeCamp—Learn OneNote Conference host

jared-decamp-pro-pixJared DeCamp is the host of the Learn OneNote Conference. As a real-estate appraiser, Jared uses OneNote every day and wants to share his OneNote knowledge, insights and wisdom. Register for the conference, share the details with your peers and follow Jared on Twitter, @Jared_DeCamp.

From Marjolein Hoekstra (OneNote Central @OneNoteC)

marjolein-pro-pixWhen Jared first told me about the conference, I immediately knew this would be a success. Within days, the speaker list was filled. I’m impressed with all the work that Jared and all the speakers have pulled off to make this event happen. As a OneNote trainer and as content curator at OneNote Central, I frequently speak with professionals. When they first learn about OneNote, the one expression I hear the most is: “OneNote, where have you been all my life?” This conference will greatly help us all to learn more about what you can do with OneNote.

Frequently asked questions

Q. Who organized the Learn OneNote Conference?

A. The conference is being organized and hosted by OneNote super-fan Jared DeCamp.

Q. How can I get involved?

A. Register your participation on the Learn OneNote Conference website. Don’t forget to confirm. If you like, submit a video in your own language. It would be great if you would announce the event to your own followers on social media.

Q. What is the hashtag for the conference?

A. The hashtag for this conference is #ONconf2016.

Q. What happens after the event?

A. After the event, access to the videos will be exclusive to participants who have purchased a Lifetime Access Pass. This is also a great way to catch up with videos you haven’t been able to watch. Details about this are sent to you after registration.

Q. Who and what should I be following on Twitter?

A. Here are all the Twitter accounts you can follow:

  • @MSOneNote—Official Microsoft OneNote account
  • @OneNoteEDU—Official Microsoft OneNote in Education account
  • @Jared_DeCamp—Organizer and host of this conference
  • @OneNoteC—OneNote Central, Marjolein’s account with daily tips and pointers about OneNote

Q. Where can I send ideas and feedback that I have about this initiative?

A. Reach out to Jared on Twitter: @Jared_DeCamp.

The post First free online OneNote conference aims to engage a global audience appeared first on Office Blogs.

MS Access Blog: A conversation about Working Out Loud Week


Today, we are featuring an interview with Simon Terry from Change Agents Worldwide—a network of professionals specializing in future of work technologies and practice—and Angus Florance, from the Product Marketing team at Yammer. Members of Change Agents Worldwide actively practice, consult on and advocate Working Out Loud.

Angus: Simon, you are an advocate of working out loud and a co-founder of International Working Out Loud Week. What is working out loud about?

Simon: Bryce Williams of Eli Lilly created the initial definition of working out loud as “Visible Work + Narration of Work.” This definition highlights the importance of transparently sharing work in progress as it happens so that others can get insight into your goals and approaches. John Stepper, a leading global expert in collaboration and working out loud, has built on this and extended the definition through his book and blog as:

“Working out loud is an approach to building relationships that can help you in some way. It’s a practice that combines conventional wisdom about relationships with modern ways to reach and engage people. When you work out loud, you feel good and empowered at the same time.”

Angus: Why are we talking about working out loud in Yammer now?

Simon: Well, maybe because this week is International Working Out Loud Week (November 7–13). Seriously though, working out loud is something we have always done though we used different terminology. The concept is not new. What has changed is the organizational priorities. Digital transformation is a key priority for organizations. To leverage the potential of digital technologies requires new ways of working as well as underlying changes to organizational culture. We have already seen organizations adopting a range of new digital ways to work like agile, design thinking, collaboration and more. What these practices have in common is a focus on making work in progress more visible so stakeholders can contribute and learn from the work. The practice of making work in progress more visible in a purposeful way is working out loud. Yammer in the Office 365 Suite offers a great way for organizations to enable their employees to benefit from working out loud.

Angus: When people hear the idea of sharing work in progress, lots of them think that is the last thing that they would want to do. What do you say to that?

Simon: I hear that comment a lot and I understand that concern given the way traditional organizations have taught us to work. Working out loud is not always a comfortable practice for people in our organizations. Over the years, many people have learned that the best ways to work are to perfect their work output in isolation and only share a finished product when they are ready to collect the glory. Too often we see that this approach results in wasteful work, missed expectations and rework. Overcoming this means helping people change their behavior by encouraging them to practice sharing their work early and often.

Angus: How can organizations help people overcome barriers to better leverage working out loud on Yammer?

Simon: In John Stepper’s approach, there are five key elements of working out loud that need to be the focus of change management, to enable people to overcome their reluctance to share work and to benefit from doing so. Organizations need to foster these elements, and Yammer offers a perfect platform to enable all five throughout your organization:

  • Relationships—Yammer is a platform that provides visibility to many more people across the organization. Rather than only relying on a hierarchical org chart and the official communication channels, Yammer enables people to engage everyone interested in and involved in their work.
  • Generosity—The daily back and forth of sharing in Yammer reinforces the culture of generosity in the organization. Having the proper role models to reinforce the benefits of generous sharing will also help promote this practice within your organization.
  • Visible work—Yammer works best when the culture of the organization supports transparency with two-way conversations and open groups so information is not hidden in silos. Yammer also enables people to find people, work and information that are relevant to their goals and challenges. Integration to tools like Delve takes this further by using the power of algorithms to bring relevant documents to employees. Visibility of work in Yammer enables greater peer accountability and fosters greater trust in your organization. This lays the path for the greater employee autonomy that the digital workplace demands.
  • Purposeful discovery—Between our challenges and our lack of time, purpose can get lost in our organizational conversations. Yammer as a platform gives people an opportunity to start a conversation about the purpose or effectiveness of work and to align with others around shared purposes, through groups. In this way, working out loud in Yammer can deliver organizations significant benefits in strategic alignment and engagement.
  • Growth mindset—A learning mindset is key to individual and organizations success, particularly in the digital era. Yammer offers a platform for social learning and for experiments, encouraging employees to empower themselves with new connections, new learnings and new information to improve their work performance.

Angus: So if I want to get started on working out loud, what are some simple ways to work out loud on Yammer?

Simon: Getting started with working out loud in Yammer is easy. Here are many ways you can begin working out loud in your daily practice at your organization:

  • Share your progress or, better yet, ask a question about a piece of work in progress in a relevant Yammer group: Take one piece of work that you have underway. Find a Yammer group that is relevant to the success of that work. Share information about your work, ask for feedback or ask a question.
  • Start a group for a team, a project or a key group of stakeholders to work out loud together: Choose a group of people who have a shared purpose in their work. Explain to them why you think there are benefits of working out loud. Run a pilot of sharing your work and learning from each other.
  • Build a habit with the three daily practices of working out loud: Working out loud is a habit. Start with some triggers to practice daily. First thing in the morning, share in Yammer a challenge from your day ahead. At lunch time, join a Yammer conversation and respond. At the end of the day, use Yammer to celebrate someone’s achievements. We can all do with more celebration.
  • Support people with Working Out Loud Circles: John Stepper’s Working Out Loud Circles are a 12-week peer-supported process to enable people to achieve a personal goal through working out loud. Run a pilot of working out loud circles in your Yammer network.

Angus: Where can people learn more about working out loud if they want to take things further?

Simon: There are several free resources available to help people get started with working out loud:

  • John Stepper has a fantastic site: Workingoutloud.com has a blog and materials to get started with Working Out Loud Circles.
  • Make the most out of International Working Out Loud Week: Wolweek is a global conversation about working out loud. Learn from others.
  • Join the Microsoft Tech Community: In the Microsoft Tech Community, you will find peers who do similar work. It is a great place to share and to learn about the opportunities in Yammer.

Angus: Thanks, Simon. It was a pleasure speaking with you today, as always.

—Angus Florance

The post A conversation about Working Out Loud Week appeared first on Office Blogs.
