Archive for the ‘SQL’ category

Docs.microsoft.com – New Microsoft documentation portal

August 6th, 2017

Microsoft Corporation has provisioned a new portal for all its application and framework documentation, and is also opening up the ability to contribute to that content.

The new docs.microsoft.com rolled out recently with documentation for each of the following topic groups, arranged by product line/focus:
• SQL
• Windows
• Microsoft Azure
• Visual Studio
• Office
• .NET
• ASP.NET
• Dynamics 365
• Enterprise Mobility + Security
• NuGet
• Xamarin

So there is no longer any need to navigate to msdn.microsoft.com. The central portal offers additional functionality and information sharing not previously available.

3 new Microsoft tools to help you move to the cloud.

April 18th, 2017

Here’s a breakdown of the three new Microsoft tools to help you move to the cloud faster and what they can offer businesses.

1. Free cloud migration assessment

This assessment will help customers to discover and better understand their current server setup, and to determine the cost and value of moving to the cloud. Once the servers are discovered, the tool can analyze their configurations and give the user a report of the potential cost savings of moving to Azure.

Data center administrators can export the results of the assessment into a customized report. The report could provide some valuable data and statistics for a CIO conversation with the CFO.

2. Azure Hybrid Use Benefit

This tool should save users money on their cloud deployments. Customers can activate the Azure Hybrid Use Benefit in the Azure Management Portal; it is available on Windows Server virtual machines in Azure to all customers. “Use your on-premises Windows Server licenses that include Software Assurance to save big on Windows Server VMs in Azure. By using your existing licenses, you pay the base compute rate and save up to 40 percent,” the tool’s web page says.

3. Azure Site Recovery

Azure Site Recovery is meant to ease the process of migrating virtual machines to Azure. Applications running on AWS, VMware, Hyper-V, or physical servers can be moved. Additionally, a new feature in Azure Site Recovery will “allow you to tag virtual machines within the Azure portal itself. This capability will make it easier than ever to migrate your Windows Server virtual machines.”

Other features include automated protection and replication of virtual machines, remote monitoring, custom recovery plans, recovery plan testing, and more.

SQL memory

April 10th, 2017

While I am a big fan of maximizing memory, it’s important to consider your memory configuration!
You add RAM to a physical server and expect it to work as you want.
Anything that leverages lots of RAM to function, including a database server, can take a substantial performance hit.
Depending on the DIMM configuration, you might slow down your memory speed, which will slow down your application servers.
This speed decrease is virtually undetectable from the OS.
An example: configuring 384GB of RAM on a new server.
The server has 24 memory slots.
• You could populate each of the memory slots with 16GB sticks of memory to get to the 384GB total.
• Or, you could spend a bit more money to buy 32GB sticks of memory and only fill up half of the memory slots.
• Your outcome is the same amount of RAM.
• Your price tag on the memory is slightly higher than for the relatively cheaper smaller sticks.
In this configuration, the 16GB DIMM option runs the memory 22% slower than if you buy the higher-density sticks.

Check out page 63 of the server build guide for an HPE ProLiant DL380 Gen9 server: https://www.hpe.com/h20195/v2/getpdf.aspx/c04346247.pdf

The fully populated 16GB stick configuration runs the memory at 1866 MHz.
When you fill only half the slots with 32GB sticks, the memory runs at 2400 MHz (1866/2400 ≈ 0.78, hence the 22% difference).

SQL Server dynamically acquires and frees memory as required. Typically, an administrator does not have to specify how much memory is allocated to SQL Server. However, the max server memory option can be useful in some environments. Make sure that sufficient memory is available for the operation of Windows Server. For example, make sure that you run a dedicated instance of SQL Server on a server that has at least 4 gigabytes (GB) of memory. If the available memory for the server drops below 500 megabytes (MB) for extended periods, then the performance of the server may degrade.

Use the ‘Memory: Available MBytes’ performance counter for the Windows Server operating system to determine whether the available memory drops below 500 MB for extended periods. If it does so frequently or for extended periods, then we recommend that you reduce the max server memory setting for SQL Server or increase the physical memory of the server.
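As a minimal sketch of both checks (assuming the SqlServer PowerShell module is installed; ‘localhost’ and the 12 GB cap are placeholder values, not recommendations):

Get-Counter -Counter '\Memory\Available MBytes' -SampleInterval 15 -MaxSamples 20   # watch available memory over 5 minutes
Invoke-Sqlcmd -ServerInstance 'localhost' -Query "
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 12288; RECONFIGURE;"   # cap SQL Server at 12 GB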

Dynamics 365 – Licensing and support key dates to be aware of. Ask Synergy Software Systems, Dubai

March 18th, 2017

License Renewal & Anniversary Date:

If you are considering an upgrade to Dynamics 365 for Operations, then your license anniversary or enhancement renewal date is significant: you have the opportunity to do a full platform and license transition. There are specific incentives and license credits available to make this transition when you are on a supported product version.

The Mainstream Support End Date of your Current Dynamics AX Software Version:

If you do not opt to move to a cloud-based license model at your license anniversary, then the next important date to consider is the mainstream support end date for the current product you are using:

Support Dates for Existing Dynamics AX On-premise products:
AX 2009, AX 2012 R1 & R2 – Mainstream support ends on April 10, 2018; extended support is available until October 12, 2021
AX 2012 R3 – Mainstream support ends on October 12, 2021; extended support is available until January 10, 2023

Why is it important to be on a Mainstream Supported Product?

There are many reasons to be on a supported product :
1. The option to receive support updates and hotfixes – this is the forum in which Microsoft collects bugs and issues and systematically releases fixes, keeping the platform up to date and reliable.
2. Regulatory compliance updates end with mainstream support – if your organization is legally obligated to follow regulatory compliance standards, those changes will need to be implemented manually.
3. Access to customer resources.

During Extended support, Microsoft provides support for the product and will provide security-related hotfixes.

Your Dynamics Roadmap – Action Plan

So, what should all of these dates mean to you? The answer depends on many things: which version you’re currently using; the range of modules, customisations, and integrations; your hardware investment; internet connectivity; how you are impacted by statutory changes; economic pressures; and so on.

For those on AX 2009, AX 2012 R1 & R2:

It’s ideal to be on a mainstream supported product, and extended support is available through October of 2021. In either scenario, you need an action plan to:
◾Decide whether to stay on premise thereafter , or whether to go cloud at some point.
◾Decide to what product you’ll upgrade and whether this will be a one or two step process – AX 2012 R3 or direct to Dynamics 365 operations?
◾Identify requirements for path chosen (Data migration, customizations, process change, short term hardware investments, whether to upgrade SQL or operating systems, etc.)
◾Budget [time and money] for the requirements gathering [can last several months] and the actual upgrade
◾Identify internal/external resources to execute the project
◾Perform and test the upgrade , and train new users.

Most companies want to have the decision and plan in place before mainstream support ends – which is just over a year away, in April 2018, for AX 2009 and AX 2012 R1 & R2.

There are other incentives to upgrade sooner rather than later.

For those on AX 2012 R3:

Like the clients on AX 2009, 2012 R1, & R2 (above), the same decisions must be made.
Do you want to upgrade to Dynamics 365 Operations?
Do you want to stay on premise?

While your mainstream support lasts longer, there are benefits and incentives to consider when deciding on a timeline for your changes.

On-premise vs. Cloud Options:

Until February 2017, existing on-premise clients had only the following options:
◾Stay with your Perpetual License on AX 2012 or earlier (keep paying Enhancement or Software Assurance)
◾Upgrade to Dynamics 365 for Operations and move to Subscription Only [Cloud] Model (available at license anniversary/renewal)
◾Upgrade to Dynamics 365 for Operations in a Hybrid Model (Perpetual License + Cloud Add-On)
◾ Move to a Dynamics 365 for Operations subscription license but continue to use Dynamics AX 2012 R3 on premise for some time before moving to Dynamics 365 for Operations (‘equivalence’)

On February 23rd, Microsoft announced a new, hybrid option based on edge computing:
◾Upgrade to Dynamics 365 for Operations, but stay on premise, either with a subscription license or keep a Perpetual License model.

At the Tech Conference on Monday last week, Microsoft announced more details about the new deployment options for Dynamics 365 for Operations that will be available in Q2 2017.

In addition to a pure-cloud environment, organizations can now choose from two options for how this can work.

◾The first is a hybrid deployment (called Cloud and Edge) where critical operations processes, as an example, can remain in an on-premise database, but the power of the cloud can be harnessed for additional scalability.

◾ The second option is, essentially, an on-premise option. Microsoft calls this option Local Business Data, where Dynamics 365 resides in your existing datacenter.

Investment Credit and Incentives:

Most companies do their budgeting annually, so planning your roadmap should already be underway in 2017. At the Summit Conference last October, Microsoft announced a 40% discount to existing on-premise clients who want to transition to the new Dynamics 365 Cloud Platform. This incentive is active for 3 years.
If you transition in 2017, you’ll be able to leverage two years of discounts;
if you transition in 2018, you’ll only be able to leverage one year of discounts;
if you do not transition until 2019, you may not get a discount (depending on the transition month).

Under Dynamics 365, licensing SKUs, functionality and license names have also changed, so there is an additional consideration in how your organization is licensed. Like previous license models, Dynamics 365 for Operations has different user license types, each with different user rights. The more precise you can be about what your users need to access, the more accurate your license transition will be, and the more money you can save.

Strategic Planning with Synergy Software Systems

Synergy Software Systems can undertake a license and environment audit to help you understand your high-level options and the costs associated with those options. If you feel that cloud has the exciting new functionality and integration that you’ve been looking for, use the time you have between now and your next license anniversary/renewal to look at Dynamics 365 for Operations with us and decide if it’s right for you.

If you’re on Dynamics AX 2009, AX 2012 R1 or R2, use the time between now and April 2018 to decide how to best leverage your existing investment in your ERP system in the next supported step in your Dynamics roadmap journey.

If you’d like help to better understand your options, then reach out to us on 00971 4 3365589.

Dynamics 365 launch event Microsoft Gulf

February 5th, 2017

Last week Synergy staff attended the Dynamics 365 regional launch day. This gave insights into the Microsoft Dynamics solution portfolio. Siegfried Leiner, Principal Program Manager for Dynamics CRM at Microsoft, gave a ‘deep dive’ keynote speech.

The event was kicked off by Samer Abu Ltaif, Regional General Manager, Microsoft Gulf, and Karim Talhouk, Regional Director, Microsoft Business Solutions, Microsoft Gulf, who presented how digital transformation is happening in the Gulf. Steve Plimsoll, Chief Digital and Data Officer, PWC, and Harris Mygdalis, Deputy CIO, Commercial Bank of Dubai, gave further insights.

This well attended event attracted customers with a wide range of requirements. Mobility, analytics, and integration were common themes.

Microsoft organisational changes from 1 February 2017.

January 11th, 2017

Microsoft is combining its Small and Mid-Market Solutions & Partners (SMS&P) and Enterprise Partner Group (EPG) business units in an attempt to streamline business processes. The changes, which will take effect from February 1, will affect its sales, partner, and services teams, and will see both units come together as one under its Worldwide Commercial Business, led by executive vice-president, Judson Althoff. Corporate vice-president of mid-market solutions and partners, Chris Weber, will lead the combined business.

This seems to echo former CEO Steve Ballmer’s 2013 One Microsoft plan. No layoffs are expected.

In Australia, Mark Leigh runs the SMS&P business after David Gage resigned from the role. As for its local EPG business, the head of the unit is yet to be appointed, as Steven Worrall was given the managing director title after Pip Marlow left the company. How these changes will affect Microsoft Australia and New Zealand is yet to be determined.

This move follows the recent departure of then Microsoft chief operating officer Kevin Turner, whose role was not replaced but was split amongst five senior executives including Althoff. As part of that restructure, Althoff was handed the Worldwide Commercial Business, focusing on the Enterprise and Partner Group, Public Sector, Small and Midmarket Solutions and Partners, the Developer Experience team, and services.

The company restructure also sees the creation of a new One Commercial Partner business, which combines various partner teams within Microsoft; a unit called Microsoft Digital, which is expected to grow Microsoft’s cloud division; and the merger of its Worldwide Public Sector and Industry businesses. One Commercial Partner will be led by former Salesforce vice president and Microsoft’s current Corporate Vice President of Enterprise Partner Ecosystem, Ron Huddleston. The new Microsoft Digital group will push Microsoft’s current customers and partners to use the company’s cloud programs. Anand Eswaran, corporate vice president of Microsoft Services, will lead that group.

Corporate Vice President of Worldwide Public Sector Toni Townes-Whitley will lead a combined group comprising Microsoft’s Worldwide Public Sector and Industry businesses.

Jeff Teper, who was Microsoft’s corporate vice president of corporate strategy, announced on Twitter last week he now leads the company’s OneDrive and SharePoint teams. It’s a familiar role, as Teper led the group that first built SharePoint for its 2001 launch. The move seems to be the latest to make room for Kurt DelBene, who was brought back to the executive team after retiring in 2013 to help the U.S. government fix the healthcare.gov website. DelBene assumed a new title as executive vice president of corporate strategy and planning in April. (Soon after, Eric Rudder, executive vice president of advanced strategy, and Mark Penn, executive vice president of advertising and strategy, announced they would be leaving Microsoft.)

David Treadwell, a longtime Microsoft executive who oversaw the Windows engineering team, is also on the move. He’s taking an unidentified role in the Cloud and Enterprise group. Treadwell told staff he was reluctant to leave the Windows team, but “when the CEO calls, well, you take that call.”

According to Microsoft’s announcement, Kim Akers and the ISV team, Victor Morales and the Enterprise Partner team, and Gavriella Schuster and the WPG team will all be moving into One Commercial Partner.

Azure – what is it exactly?

January 8th, 2017

You may have recently seen a television commercial for “The Microsoft Cloud,” which featured Healthcare, Cancer Research, and Cybercrime. So, what does this have to do with Microsoft Azure?

Microsoft Azure is the Microsoft product name for the Microsoft Cloud. The names are used synonymously in the technical industry.

Amid the digital transformation shift to the cloud, the question remains: “What is Azure, and for whom is it meant?”

Azure was announced in October 2008 and released in February 2010 as Windows Azure, and was then renamed Microsoft Azure in March 2014.

Azure is a cloud computing platform, plus the underlying infrastructure and management services, created by Microsoft to build, deploy, and manage applications and services through a global network of Microsoft-managed data centers.

What are the Microsoft Azure Data Centers?

There are 34 interconnected Microsoft Data Regions around the world with more planned.

Microsoft describes Azure as a “growing collection of integrated cloud services, including analytics, computing, database, mobile, networking, storage, and web.” Azure’s integrated tools, pre-built templates and managed services simplify the task of building and managing enterprise applications (apps).

Microsoft Corp. CEO Satya Nadella calls Azure, “the industry’s most complete cloud — for every business, every industry and every geography.”

The Complete Cloud

For many businesses, their first foray into leveraging cloud software as a service (SaaS) is with Microsoft Office 365, Exchange online for hosted email, or CRM online for managing business and customer relationships. However, the Azure platform is much more than just an online business software delivery platform.

Here are just a few of the things that you can do with Azure:
• Build and deploy modern, cross platform web and mobile applications.
• Store, backup and recover your data in the cloud with Azure-based disaster recovery as a service (DRaaS).
• Run your Line of Business applications on Azure.
• Run large scale compute jobs and perform powerful predictive analytics.
• Encode, store and stream audio and video at scale.
• Build intelligent products and services leveraging Internet of Things services.

Use Azure, and your partner, to rapidly build, deploy, and host solutions across a worldwide network and to create hybrid solutions which seamlessly integrate on premise existing IT with Azure.

Many leverage Azure to protect data and meet privacy standards like the new international cloud privacy standard, ISO 27018, or HIPAA.

Azure customers can quickly scale up infrastructure and, just as importantly, scale it down, while only paying for what they use.

Azure also supports a broad selection of operating systems, programming languages, frameworks, tools, databases and devices.

Contrary to the perception that Azure is for Windows only, nearly one in three Azure virtual machines run Linux.3

Widespread Adoption

More than 80 percent of Fortune 500 companies rely on Azure, which offers enterprise grade SLAs on services. In addition, Microsoft is the only vendor positioned as a Leader across Gartner’s Magic Quadrants for Cloud Infrastructure as a Service (IaaS), Application Platform as a Service (PaaS), and Cloud Storage Services for the second consecutive year.1

What is Microsoft Azure IoT?

Microsoft’s powerful Azure Internet of Things Hub and tool suite has also been widely adopted for use in commercial and scientific applications to securely connect and manage Internet of Things (IoT) assets. The service processes more than two trillion IoT messages weekly.4

From broadcasting the Olympics to building massively multiplayer online games, Azure customers are doing some amazing things, and in increasing numbers. Microsoft recently revealed that the rate of Azure customer growth has accelerated to more than 120k new Azure customer subscriptions per month.4 In line with the accelerated adoption, the company is projecting an annualized commercial cloud revenue run rate of $20 Billion in 2018.3

Cloud Leadership

With Azure, Microsoft has made a huge commitment to cloud computing. Since opening its first datacenter, Microsoft has invested more than $15 billion in building its global cloud infrastructure.5 In addition, the company recently announced it would build its first Azure data center in France this year as part of a $3 billion investment to build its cloud services in Europe.6

Microsoft is quickly closing the gap in market share with IaaS provider Amazon Web Services (AWS). While 37.1% of IT professionals surveyed indicated that Amazon AWS is their primary IaaS platform, Microsoft Azure is a close second at 28.4%, followed by Google Cloud Platform at 16.5%.7

And hot off the press…
Microsoft isn’t building its own connected car — but it is launching a new Azure-based cloud platform for car manufacturers to use the cloud to power their own connected-car services.

The new Microsoft Connected Vehicle Platform will go live as a public preview later this year.
“This is not an in-car operating system or a ‘finished product’,” Microsoft’s EVP for business development Peggy Johnson writes in this week’s announcement. “It’s a living, agile platform that starts with the cloud as the foundation and aims to address five core scenarios that our partners have told us are key priorities: predictive maintenance, improved in-car productivity, advanced navigation, customer insights and help building autonomous driving capabilities.”

Microsoft also announced that it is partnering with the Renault-Nissan Alliance to bring the new connected-car services to Renault-Nissan’s next-gen connected vehicles. The two companies were already working together on other projects before this, so it’s maybe no surprise that Renault-Nissan is Microsoft’s first partner.
Microsoft is also working with BMW to develop that company’s BMW Connected platform on top of Azure. BMW and Nissan also showed in-car integrations with Microsoft’s Cortana digital assistant at CES this year, so your future car could potentially use Cortana to power its voice-enabled services. For the time being, though, it looks like these are still experiments.

Microsoft has talked about its aim to bring “intelligence” to as many of its services as possible. It has also recently opened up Cortana to third-party developers, so bringing it to its connected car platform is a logical next step (and we’re also seeing Amazon do the same thing with Alexa).

Johnson also used today’s announcement to take a thinly veiled swipe at Google/Alphabet, which spun out its self-driving car unit a few weeks ago. “As you may have gathered, Microsoft is not building its own connected car,” she writes. “Instead, we want to help automakers create connected car solutions that fit seamlessly with their brands, address their customers’ unique needs, competitively differentiate their products and generate new and sustainable revenue streams.”

Powershell – why?

December 19th, 2016

What is the point of PowerShell?

It handles any task that requires scripting and gives power back to the user, developer, or administrator. PowerShell is a tool that operates at a very high level of abstraction, and thus can quickly provide a means of getting a task done by creating a chain of software tools, without resorting to writing a compiled application.

PowerShell is an extensible, open-source, cross-platform, object-oriented scripting language that uses .NET.

It can use COM, WMI, WS-Management and CIM to communicate and interact with any Windows-based process.
It can execute scripts on either a local workstation or remotely.
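For example (a sketch only; ‘Server01’ is a hypothetical machine name and assumes PowerShell remoting has been enabled on it):

Get-Service -Name Spooler                                                            # run locally
Invoke-Command -ComputerName 'Server01' -ScriptBlock { Get-Service -Name Spooler }   # run the same command remotely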

It is ideal for automating all sorts of processes, and is simple enough to manage your workstation, and yet robust enough to manage SQL Azure.

It will evolve to become the built-in batch processing system in future versions of Windows.

It is important as a configuration management tool and task automation tool, and is versatile enough to be used as a general-purpose programming language.

How did PowerShell come about?

Unix inherited from its mainframe ancestors the use of batches and scripts.
The use of the script allowed UNIX to develop a group of specialized applications that did one job and did it well.
Data could be passed into an application through its standard input, and the results passed to the standard output which meant that data could be streamed like items on a conveyor belt.

It was like building repeatable processes out of Lego-like blocks of code.

Scripting did more than encourage piped streams of data. It also encouraged batches and command-line configuration of machines and services. This made it easy to use Unix for servers, because all administration tasks could be scripted.

Scaling up to large groups of servers was smooth since everything was in place to allow it to happen.

Scripting also made the management of servers more precise and error-free. After the script was developed, no further work was needed. The script would do the same thing in the same order with the same result. It made operations work a lot easier.

Think the Korn shell, with ideas from the Bash shell.

Why didn’t Windows have a powerful scripting language like Korn?

Unlike the contemporary UNIX workstations, the first PCs had no pretensions to host server processes. They were low-specification, affordable personal computers that initially conquered the market previously occupied by dedicated word processors, before becoming ubiquitous with the invention of the spreadsheet.
They had the ability to run batches, but this was intended merely to ease the task of installing software. Scripting just seemed old-fashioned, or best left to DOS.

Microsoft DOS could and did run batches from the command processor, and autoexec.bat is still there in Windows (called AUTOEXEC.NT and located in the %SystemRoot%\system32 directory).

After MSDOS borrowed from the UNIX clone Xenix, this command processor took on some of the features of UNIX shells such as the pipe, but with limited functionality when compared to the UNIX shells.

Microsoft Windows was originally booted from the command processor and, in later editions, took over the tasks of the operating system and incorporated the old MSDOS command-line interface tool (shell).

The features of the batch were sufficient to allow it to do a lot of configuration, installation and software maintenance tasks. The system wasn’t encouraged or enhanced after Xenix was abandoned, but remained a powerful tool. Xenix’s replacement, Windows NT or WNT (shift each letter of DEC’s VMS up by one to guess its parentage), did not add anything new to the command processor, and inherited the enhanced version from MSDOS 3.3.

This batch language still exists in the latest versions of Windows, though it is due to be deprecated. It has had quite a few enhancements over the years, but essentially what came into MSDOS is still the basis of what is currently shipped. It has a major failing within a Windows environment: it cannot be used to automate all facets of GUI functionality, since that demands at least COM automation and some way of representing data other than text.

There have been attempts to replace the DOS batch file technology, including VBS and Windows Script Host (1998), but PowerShell is by far the most effective replacement.

Why did it take so long to get PowerShell?

Maybe because Microsoft under Bill Gates retained the vision that BASIC should remain the core language for Windows scripting and administration …

Basic scripting driving COM automation

Today the Office applications still have underlying BASIC scripting that can be controlled via COM automation. To keep batches consistent with this, the tasks done by batch scripting were to be done by Visual Basic for Applications: VBA. This was supplied with the operating system to drive all automation tasks. Wang’s Office system, similarly, was automated and scripted via Cobol!

Language-driven development and divergence

Over time, each Office application developed a slightly different, incompatible dialect, and they could not be kept in sync. Visual Basic was inadequate for the task and evolved into VB.NET, a somewhat comical dialect of Java. It proved to be unpopular. VBA was never quite consistent with the Visual Basic used for building applications.

Windows Script Host

Windows Script Host was introduced as an automation and administration tool to provide automation technology, primarily for Visual Basic and JavaScript. It supported several interpretive languages such as BASIC, Perl, Ruby, Tcl, JavaScript, Delphi and Python. Initially it had security loopholes, finally solved with digital signing in Windows XP. It is still installed with MS Windows and still provides a number of useful COM interfaces that can be accessed in PowerShell and any other application that can interact with COM.

Windows Script Host was designed before .NET, so it is not able to directly use the .NET library. It could, however, use WMI, WS-Management and CIM for administration and monitoring. It focused on managing the platform by using very low-level abstractions such as complex object models, schemas, and APIs. Although it was useful for systems programming, it was almost unusable for the typical small, simple and incremental task that is at the heart of administration, which needs very high levels of abstraction.

Microsoft competes in the server market

Microsoft was focused on the desktop market for a long time, so maybe did not realize the scale of the problem of competing in the server market. In the GUI-centric Microsoft culture and ecosystem, the idea was that all configuration was a point-and-click affair: fine for one or two servers, but not so easy or error-free for a server room.

PowerShell

Due to the determination and persuasive powers of Jeffrey Snover, Microsoft belatedly woke up to the fact that it didn’t have a viable solution for the administration of a number of servers in a medium-sized company.
The GUI didn’t scale, and the batch system of the command line, though useful, was stuck in a mid-eighties time-warp.

Microsoft had to replace the command line; so it needed all the things it and other interactive shells had, such as aliases, wildcard matching, running groups of commands, conditional running of groups of commands and editing previous commands.
It also had to replace VBA, and to integrate easily with Windows Management Objects.
It had to take over the role of VBA embedded in applications to make automation easier.

Microsoft needed something that looked both backwards and forwards, i.e. an industry-standard shell backward-compatible with the command line.

PowerShell started with the POSIX standard shell of IEEE Specification 1003.2, the Korn shell, which is also available in Windows. However, this dealt only with strings, so it had to be altered to deal with objects too, so that it could access WMI, WS-Management, CIM and COM. Because it needed so much connectivity and data interchange, it had to be able to use the .NET library to process .NET objects and datatypes.

So a new system also needed to understand .NET to utilize the man-years of work of providing a common object model able to describe itself, and that can be manipulated without converting either to or from text. The new scripting system had to be resolutely object-oriented.

So the new PowerShell needed the ability to use any .NET object or value.

PowerShell was given an intuitive naming convention based on the verb-noun pair, with simple conventions such as ‘Get’ to get an object, and a noun describing the object.
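A few real cmdlets illustrate the convention:

Get-Process                          # ‘Get’ the running processes
Get-Service -Name W32Time            # ‘Get’ one specific service
Stop-Service -Name W32Time -WhatIf   # -WhatIf previews the action without performing it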

To replace the command line, PowerShell had to be better. The whole point of a command shell is that it must be convenient to type short commands into it, e.g. like the ‘REPL’ in Python. PowerShell also needs to work with existing terse DOS command-line commands so that an expert can type in very truncated commands.

PowerShell was also to be used in scripts stored on disk and repeatedly invoked, with just a change in parameters. This also meant that it had to be easy to read, with intuitive commands and obvious program flow.

It wasn’t an easy compromise, but it was done by means of aliases. Aliases also helped to ‘transition’ users from the other shells they were using to PowerShell (for CMD.EXE it is dir, type, copy etc.; for UNIX, ls, cat, cp etc.). You can even define your own in PowerShell!
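For instance:

Get-Alias dir, ls                       # both resolve to Get-ChildItem
Get-Alias cat, type                     # both resolve to Get-Content
Set-Alias -Name np -Value notepad.exe   # define your own (np is just an illustrative name)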

PowerShell took an idea from .NET: everything should be learnable by discovery, without needing documentation. All the objects and cmdlets in PowerShell are self-documenting, in that you can use PowerShell to find out what they do, what functions can be called, and what parameters they take.
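For example:

Get-Command -Verb Get -Noun Service*   # discover cmdlets by verb and noun
Get-Help Get-Service -Examples         # built-in help, with usage examples
Get-Service | Get-Member               # list the properties and methods of the objects returned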

Why is the PowerShell Pipeline Important?
The pipeline in PowerShell inherited the concept of a pipe from UNIX. The PowerShell team had to solve the problem of dealing with Windows Management Objects and Instrumentation by passing objects rather than text down the pipe.

Having done so, it found itself in possession of a radical and extraordinarily useful system. It had the means of processing objects as though they were on a conveyor belt, with the means of selecting and manipulating each one as it passed down the pipeline.

This made the code easier to understand and also helped with memory management. A long file could be passed down a pipeline, line by line, for example when searching for text, instead of having to read the entire file into memory (you can do that too if you want, and if you have no fear of the large object heap; you have to do it if you want, for example, to order the lines). It also meant you needed only one cmdlet for selecting things, one for sorting, one for grouping, and only one for listing things out in a table. PowerShell could do a lot in a line of code, far, far more than C# could.
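A sketch of both patterns (server.log is a hypothetical file):

Get-Content .\server.log | Select-String -Pattern 'error'   # streams the file line by line
Get-Process | Where-Object CPU -gt 100 |                    # the same generic cmdlets work on any objects
    Sort-Object CPU -Descending |
    Select-Object -First 5 Name, CPU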

Suddenly, the task of tackling the huge range of data on the average server that one might need to know about was less frightening. It was already there, and was now easy to get at and filter.

Why is PowerShell useful?
Scripts don’t require special components.

PowerShell now has all the power of a compiled .NET language. Automating processes using the Windows command line required many existing command files to determine settings and to configure, which meant that the developer often had to write components in a compiled language. In developing scripts, part of the time was spent making small commands. This isn’t necessary in PowerShell, thanks to .NET.

PowerShell simplifies the management of hierarchical data stores. Through its provider model, PowerShell lets you manage data stores such as the registry, or a group of SQL Servers, using the same techniques of specifying and navigating paths that you already use to manage files and folders.
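For example, the registry can be navigated with the same cmdlets you use for folders (and, with the SQL Server module loaded, so can a SQLSERVER: drive):

Get-PSDrive                    # list the available drives and their providers
Set-Location HKLM:\SOFTWARE    # move into the registry as though it were a folder
Get-ChildItem                  # ‘dir’ the registry keys
Set-Location C:\               # and back to the filesystem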

This doesn’t turn PowerShell into a rival to C#, VB.NET, ActionScript or F#.

It is not for developing applications but for automating administrative tasks.

It is theoretically possible to write a webserver in PowerShell, or an interactive GUI using Windows Presentation Foundation but that is not its purpose.

What is PowerShell’s main use?

Traditionally, the command line was used for complex deployments. PowerShell can work remotely on any computer in the domain and give far more information about the computer. It quickly became the default means of deployment for Windows.

This is great for the developer. He develops his package in NuGet and can use Chocolatey to deploy it. Linux allows you to install a package with just one simple command; Chocolatey does the same, but also allows you to update and uninstall the package just as easily. A simple script can grab the latest source from Git, compile it, install it and any dependencies, and do any special configuration tasks. There are 4,511 packages you can install from the Chocolatey site. PowerShell now has its own package manager, but the current incarnation isn’t as versatile as Chocolatey.
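For example, using Git as an illustrative package:

choco install git -y     # install the package and its dependencies
choco upgrade git -y     # update it later
choco uninstall git -y   # remove it just as easily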

Server Administration.

The release of PowerShell was most welcomed by the server teams.
The Microsoft Exchange Server team were early adopters and used PowerShell to allow the administration of Exchange.
The Microsoft SQL Server team and Active Directory team also followed suit.
These teams provided specialized cmdlets that covered all aspects of the administration of the server.
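A sketch using the SQL Server cmdlets (this assumes the SqlServer module is installed and a default local instance; both are placeholders):

Import-Module SqlServer
Get-SqlDatabase -ServerInstance 'localhost' | Select-Object Name, Status, Size   # list databases and their sizes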

Windows Server now has the capability of using Hyper-V to provide a ‘private cloud’, which allows companies to offer a degree of ‘self-service’ for server resources – all driven and maintained by PowerShell.

Provisioning.

Provisioning is one of the areas where PowerShell excels.

PowerShell’s DSC (Desired State Configuration) feature allows a PowerShell script to specify the configuration of the machine being provisioned, using a declarative model in a simple, standard way that is easy to maintain and to understand.

It can either ‘push’ the configuration to the machine being provisioned, or get the machine to ‘pull’ the configuration.
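A minimal sketch of the declarative model (‘Server01’ is a hypothetical node name):

Configuration WebServer {
    Node 'Server01' {
        WindowsFeature IIS {
            Ensure = 'Present'    # declare the end state, not the steps to reach it
            Name   = 'Web-Server'
        }
    }
}
WebServer -OutputPath .\Mof                  # compile the configuration to a MOF document
Start-DscConfiguration -Path .\Mof -Wait     # ‘push’ it to the node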

Chocolatey, a PowerShell script, can not only install a large range of software, but also update it or remove it.

PowerShell has a built-in system called ‘PackageManagement’ that isn’t so versatile, but which allows you to install packages from a wider variety of sources.

Use PowerShell within an application or website
As well as providing a scripting environment, PowerShell can be embedded into an application by using System.Management.Automation, so that the user of the application can extend it via scripts. You can even do this in ASP.NET.

Parallelism and workflow
Although PowerShell is an interpreted dynamic language (using .NET’s DLR), its performance is enhanced by its ability to run parallel processes and to run asynchronously. It is also designed to run securely on other machines, remotely, and pass data between them. All this is possible without Workflow.

Scripted processes that have complex interdependencies need to be interruptible and robust, and that is what PowerShell workflow supports.

Workflow can be complicated, and will always be a niche technique for scripting. However, it is now able to run complex workflows within a domain, thereby making it possible to script even the most difficult business processes: those that contain long-running tasks which require persistence and need to survive restarts and interruptions.

PowerShell uses the Windows Workflow Foundation (WF) engine. A PowerShell workflow involves the PowerShell runtime compiling the script into Extensible Application Markup Language (XAML) and submitting this XAML document to the local computer’s Workflow Foundation engine for processing.
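A minimal sketch of a workflow (the checkpoint is what lets it survive a restart; ‘Server01’ is a hypothetical node):

workflow Get-ServerInventory {
    parallel {                                          # these activities run concurrently
        Get-CimInstance -ClassName Win32_OperatingSystem
        Get-CimInstance -ClassName Win32_LogicalDisk
    }
    Checkpoint-Workflow                                 # persist state to disk at this point
}
Get-ServerInventory -PSComputerName 'Server01'          # run against a remote node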

PowerShell Workflow scripting is particularly useful in high-availability environments for processes such as ETL (data Extraction, Transformation and Loading) that potentially require throttling and connection pooling, and it is ideal where data must come from a number of sources and be loaded in a certain order.

SQL Server 2016 SP1 – this is a big deal – Synergy Software Systems

November 19th, 2016

In addition to a consistent programmability experience across all editions, SQL Server 2016 SP1 introduces all the supportability and diagnostics improvements first introduced in SQL Server 2014 SP2, as well as new improvements and fixes centered around performance, supportability, programmability and diagnostics, based on the learnings and feedback from customers and the SQL community.

SQL Server 2016 SP1 also includes all the fixes up to SQL Server 2016 RTM CU3, including Security Update MS16-136.

SQL editions have traditionally been differentiated by features. This meant that essential features for day-to-day database use were not present in Express or Standard editions. Our view is that this is not desirable: there is a core set of features needed in all editions, and differentiation should be more about hardware size and resources supported.

Well, SQL Server 2016 SP1 now brings us close to that wish, so it’s a really big deal for the SMB and mid-market customer.

Once you have an application using SQL Server 2016 Standard Edition, you can just do an Edition Upgrade to Enterprise Edition to get even more scalability and performance, and take advantage of the higher license limits in Enterprise Edition. You will also get the intrinsic performance benefits that are present in Enterprise Edition.

The table compares the list of features which were previously available only in Enterprise edition and which are now enabled in Standard, Web, Express, and LocalDB editions with SQL Server 2016 SP1. This consistent programming surface area allows developers and ISVs to develop and build applications leveraging these features, which can then be deployed against any edition of SQL Server installed in the customer environment.
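Before building on a newly unlocked feature, it is worth confirming the edition and patch level you are actually running; a minimal sketch via the SqlServer module (‘localhost’ is a placeholder):

Invoke-Sqlcmd -ServerInstance 'localhost' -Query "
SELECT SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('ProductLevel')   AS Level,     -- e.g. SP1
       SERVERPROPERTY('ProductVersion') AS Version;"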

This is a bold move by Microsoft, and should increase Standard sales, and customer satisfaction, without cannibalizing Enterprise sales. Standard Edition customers can use these features both to consolidate their codebases and, in many scenarios, build solutions that offer better performance.

There are many new features available across all editions with SP1.
There are still differences in Enterprise:

Availability features like online operations, piecemeal restore, and fully functional Availability Groups (e.g. read-only replicas) are still Enterprise only.

Performance features like parallelism still don’t work in Express Edition (or LocalDB).

Automatic indexed view usage without NOEXPAND hints, and high-end features like hot-add memory/CPU, will continue to be available only in Enterprise.

Operational features like Resource Governor, Extensible Key Management (EKM), and Transparent Data Encryption will remain Enterprise Edition only.

Others, like Backup Encryption, Backup Compression, and Buffer Pool Extension, will continue to work in Standard, but will still not function in Express.

SQL Server Agent is still unavailable in Express and LocalDB. As a result, Change Data Capture will not work. Cross-server Service Broker also remains unavailable in these editions.

In-Memory OLTP and PolyBase are supported in Express, but are unavailable in LocalDB.

Virtualization Rights haven’t changed and are still much more valuable in Enterprise Edition with Software Assurance.

Resource limits on the lower-level editions remain the same. The upper memory limit in Standard Edition is still 128 GB (while Enterprise Edition is now 24 TB).

I feel that Standard Edition is expensive enough that its memory limit should never be so dangerously close to the upper bound of a well-equipped laptop, and maybe we should expect the limit to increase at least with each new version. When you are on Standard Edition and scale is required, you can now use many Enterprise features across multiple Standard Edition boxes or instances, instead of trying to scale up.

All the newly introduced Trace flags with SQL Server 2016 SP1 are documented and can be found at http://aka.ms/traceflags.

SP1 contains a roll-up of solutions provided in SQL Server 2016 cumulative updates, up to and including the latest Cumulative Update – CU3 – and Security Update MS16-136, released on November 8th, 2016. Therefore, there is no reason to wait for SP1 CU1 to ‘catch up’ with SQL Server 2016 CU3 content.

The SQL Server 2016 SP1 installation may require a reboot after installation.

Microsoft Dynamics 365 now available in the U.A.E. – ask Synergy Software Systems

November 1st, 2016

Microsoft Dynamics 365 is a suite of cloud services to help companies to accelerate their digital transformation with purpose-built apps to address specific business needs.

Dynamics 365 unifies CRM and ERP functions into applications that work smoothly together across all divisions: sales, customer service, field service, operations, financials, marketing, and project service automation. These apps can be easily and independently deployed and scaled on demand.

Start with what you need the most.

All apps are delivered through easy-to-use, mobile experiences and feature offline capabilities.  

Users can rely on Power BI, Cortana Intelligence and Azure IoT functions which are natively embedded.

In addition, Dynamics 365 and Office 365 are deeply integrated. Since Dynamics 365 uses a new Common Data Service, customers can extend functionality and build custom apps using PowerApps and Microsoft Flow (News: PowerApps and Flow available), as well as professional developer solutions.

To find out more call us now on 00971 43365589.

Azure Analysis Services now in Preview

October 30th, 2016

This past week Microsoft announced Azure Analysis Services at the PASS Summit. This is the evolution of SSAS on the SQL Server platform, with the ability to move your tabular models into Azure and run them on an as-needed basis. This means that you don’t need to administer your own SSAS instance, and you can connect to cloud and on-premise data sources for your data analysis needs.

Since SQL Server has moved its codebase to primary development in Azure with periodic releases on-premise, this is a good sign that Analysis Services will continue to receive investment in the future. There are restrictions, e.g. no multidimensional models, so to test out SSAS in Azure, understand that limitation and that this is still in preview. Expect this platform to evolve and update at least quarterly, with new features and fewer restrictions over time. You can get started quickly.

As with other services in Azure, I both like and dislike some things. It’s great that the platform evolves and changes quickly, but I’d like to know which release of Azure Analysis Services I’m using. Not every evolution is helpful to all, and some will break systems, so knowing there has been a change or a new release can dramatically speed up troubleshooting. When the version of the Azure system changes, we know then to look at Azure release notes rather than our own code.

SSAS has still not become as popular as we might have expected 17 years ago, but it is still alive and well, moving into the cloud and receiving development resources. There are problem domains that are addressed well by SSAS, and the ability to use the technology as an on-demand platform, without adding additional administrative and hardware resources, is welcome.

A fundamental conceptual understanding of SSAS and MDX still escapes most people. Nowadays we have many tools that can read data from SSAS instances and help users query data. That means we need fewer people and less time to design and maintain SSAS instances, and also that we can easily create and destroy those instances as needed on Azure.

SQL Cumulative updates for September 2016

September 24th, 2016

Cumulative update 14 release for SQL Server 2012 SP2 is now available for download at the Microsoft Downloads site. Please note that registration is no longer required to download Cumulative updates.
CU#14 KB Article: https://support.microsoft.com/en-us/kb/3180914
Microsoft® SQL Server® 2012 SP2 Latest Cumulative Update: https://www.microsoft.com/en-us/download/details.aspx?id=50731
Cumulative update 5 release for SQL Server 2012 SP3 is now available for download at the Microsoft Downloads site. Please note that registration is no longer required to download Cumulative updates.
To learn more about the release or servicing model, please visit:
• CU#5 KB Article: https://support.microsoft.com/en-us/kb/3180915
• Microsoft® SQL Server® 2012 SP3 Latest Cumulative Update: https://www.microsoft.com/en-us/download/details.aspx?id=50733
• Update Center for Microsoft SQL Server: http://technet.microsoft.com/en-US/sqlserver/ff803383.aspx
Microsoft also just announced that Cumulative Update 2 for SQL Server 2016 is now available for download here. There are a number of fixes, including one regarding the Query Store. See here – you should seriously consider this update if you are running SQL Server 2016.

Stephen Jones
Director
Synergy Software Systems
009714 3365589
Visit our active blog site for news and the latest product information
www.synergy-software.com/blog
Microsoft Award – Highest Customer Satisfaction 2014
Microsoft President’s Club 2015

Gartner shows Microsoft Azure as a cloud leader for the third successive year

September 23rd, 2016

Gartner has recently identified Microsoft Azure as a leader in the analyst firm’s Magic Quadrant for Cloud Infrastructure as a Service (IaaS) for the third year in a row, based on both completeness of vision and the ability to execute.

Microsoft’s Azure cloud platform enables the creation of virtual networks, servers and machines, and supports multitenant storage, object storage and a robust content delivery network for both Microsoft and other vendor solutions. Azure also provides advanced services such as machine learning and the Internet of Things.

The Azure infrastructure has security integrated from the ground up, and all data, whether at rest or in transit, is strongly encrypted. All of these offerings are supported by a leading-edge Cyber Defense Operations Centre that monitors customer infrastructure around the clock.

Gartner’s announcement comes at a time when the Gulf region is taking significant steps towards cloud infrastructure adoption. Saudi Arabia plans to invest $2 trillion in IT projects in the coming years, with a significant portion to be invested in cloud. Meanwhile, the United Arab Emirates will see gradual growth in IT spend from now until 2020, according to a report from BMI Research. A compound annual growth rate (CAGR) of 3.4 per cent is expected.

An accompanying decline in hardware sales, together with BMI’s prediction that SaaS will take an increasing share of software sales, strongly indicates a decisive shift to cloud for the GCC.

When Microsoft announced the G series of virtual machines, back in Q1 of 2015, it represented the most memory, highest processing power and largest local SSD capacity of any VMs then available in the public cloud. The G series allowed Azure to lead the market with continued innovation, also supporting SAP HANA workloads of up to 32 TB. Azure has industry-wide recognition too for its support of Linux and other open-source technologies, with nearly one third of all Azure VMs running Linux.

Gartner’s report singled out Microsoft’s “rapid rollout” of these new features and many others, signaling that the company’s brand and history, both with its customers and with its delivery of enterprise-class solutions and services, combine to allow the company to ‘rapidly attain the status of strategic cloud IaaS provider’.
“Microsoft Azure encompasses integrated IaaS and PaaS components that operate and feel like a unified whole,” Gartner analysts wrote.

“Microsoft has been rapidly rolling out new features and services, including differentiated capabilities. It has a vision of infrastructure and platform services that are not only leading standalone offerings, but also seamlessly extend and interoperate with on-premises Microsoft infrastructure (rooted in Hyper-V, Windows Server, Active Directory and System Center), development tools (including Visual Studio and Team Foundation Server [TFS]), middleware and applications, as well as Microsoft’s SaaS offerings.”

Gartner’s analysts also cited Microsoft’s “deep investments” in engineering and “innovative roadmap” as crucial factors in the company’s current IaaS market standing. The report further recommends Microsoft Azure for general business applications and development environments that use Microsoft technologies; migration of virtualized workloads for Microsoft-centric organizations; cloud-native applications (including Internet of Things applications); and batch computing.

Microsoft is the leading platform and productivity company for the mobile-first, cloud-first world, and its mission is to empower every person and every organisation on the planet to achieve more.

Microsoft Gulf opened its Dubai-based headquarters in 1991, the same year as Synergy Software Systems.

For cloud hosting, or to back up to the cloud, or for applications like Dynamics 365, AX RTW (7), Synergy MMS, our xRM HIS, imaging storage and document management, or for cloud-based monitoring of your cloud and on-premise networks, find out how we can help with your move to the cloud.

Cloud – IaaS, PaaS, SaaS – what does this mean for Microsoft Dynamics?

September 20th, 2016

• IaaS – Infrastructure as a Service shares a huge grid of raw computing power and storage, including databases, rules engines, processing power and other infrastructure capabilities. A cloud provider makes this power accessible on an “as needed” basis. It’s usually charged as a utility – you pay for what you use. For many companies this is a way to outsource much of their day-to-day IT server management and strategy. The cloud provider takes care of hardware, database and operating system upgrades, patches, log management, database tuning and expansion, and support, together with the associated overhead costs of energy, staff and server room (no fit-out, no space to rent, no A/C needed, no separate maintenance contracts, no need for backup and anti-virus software, etc.)
• PaaS – Platform as a Service. This allows developers to access tools to create their own applications. The building blocks are made available by the software vendors to provide a jump start on development. PaaS is what your supply chain IT support or third-party consultants can use to create customized workflows or tools specific to your needs. This may also, for example, be a temporary environment for testing.
• SaaS – Software as a Service. The business process application (“solution” in vendor-speak) layer in the cloud. Users can ‘rent’ or subscribe to applications on a per-use basis to tackle specific business issues. SaaS is what you log on to and use without any customization. Some vendors may also offer a rent-to-buy option.

Dynamics CRM is available in the cloud as part of the Office 365 suite, and also on premise.

AX can be run on premise, or the on-premise licenses can be hosted on a cloud such as Azure, i.e. IaaS charges are based on the server and database option selected, the amount of storage space, the number of environments, and a per-month cost based on usage.
Both CRM and AX are also available on an SPLA, i.e. a rental licence basis per user role, either on premise or on a hosted platform.
Third-party hosting providers may offer a fixed-price SPLA and IaaS offering; for example, Synergy Software Systems provides this option via our partner SaaS Plaza.

The current version of AX, released under the codename “AX 7”, or sometimes known as AX RTW, provides both infrastructure and platform as a service, and you could also say software as a service for many businesses. It is currently only available as a cloud offering with a minimum of 20 Enterprise users or equivalent (this minimum number was reduced to 20 this month).

Note: historically, AX has been a favored development platform and is often bought with the purpose of customizing the logic to match the exact needs of the business, rather than the company attempting to adjust its business processes.

The next step will be Dynamics 365, which will launch in the USA in November this year and which we expect to see in the U.A.E. in mid 2017. This will be an integrated solution with a common database; a common data layer (out-of-the-box master data integration); office productivity (Word, Excel…) and communication tools (Skype for Business); collaboration tools (SharePoint, Yammer…); CRM; and a business logic layer, which until recently was called ‘Madeira’. This will extend to include sales, purchases, finance and more, and will be an excellent solution for the SMB sector and medium-sized businesses. For the Enterprise customer, Dynamics AX 7 will also be an option. Finally, a set of tools will work across this solution stack: PowerApps (think Xamarin, AX Studio), Power BI, Microsoft Flow etc. And that’s not all: there will also be an apps store, and it will be easy for developers to push new apps into the store.

We will be covering a lot more on this topic in coming months.

So the cloud is coming and offers more options than ever. Many companies will have a mix of on-premise and cloud-based solutions for the foreseeable future. The Azure technology stack is due mid next year and will enable on-premise AX 7 as well as other hybrid cloud options. There are many other Azure solutions, e.g. cloud-based backup, Cortana Analytics for BI, or the Azure IoT stack.

All the major IT vendors are now focused on the cloud. As the world becomes more mobile and business models more disruptive, expect its adoption to accelerate. We already make mass use of cloud systems (OneNote, Dropbox, Facebook, LinkedIn, Google apps, Hotmail, YouTube, Vimeo…) and many mobile apps in our personal lives, and the new ‘Generation Z’ employees now entering the job market expect the same power, agility and simplicity of use in their work tools.
So if the cloud, social and mobile are not part of your IT strategy, then you have already ‘missed the bus’.

Why the ‘cloud’? What is a hybrid cloud? Ask Synergy Software Systems, Dubai

September 19th, 2016

Buy or rent? On premise or SaaS? The answer to these questions, for enterprise computing, goes in cycles. When mainframe computing was at its peak, many organizations did not own such expensive machines outright, and many companies rented processing time on these machines when needed, an arrangement known as time-sharing.
Moore's law changed that. The era of mini — and then micro — computing made processing power so cheap that many organizations chose to own. As enterprise computing infrastructures became more complex, and the cost and difficulty of finding expert IT staff increased, renting, or subscription as it is now called, came back into vogue, in the form of Software-as-a-Service (SaaS) and cloud computing.

The terms "cloud" and "data centre" may sound like interchangeable technical jargon or trendy buzzwords, but they describe different choices. A data centre is ideal for companies that need a customized, dedicated system giving them full control over their data and equipment. Typically, organizations with many integrations, uncertain internet connections, and an internal IT team will consider this route. Since only the one company uses the infrastructure, a data centre suits organizations that run many different types of applications and complex workloads.

A data centre, however, has limited capacity — once you build a data centre, you cannot instantly change the amount of storage or processing power to accommodate, for example, significant changes in workload and data processing. A cloud system, on the other hand, is scalable to your business needs: it has potentially unlimited capacity, based on your vendor's offerings and service plans. When you are looking at big-data processing for predictive analytics, or have heavy day-end or seasonal workloads, the ability to ramp capacity up and down is important to avoid oversizing. For project-based companies, both the number of user licences required and the processing power needed may vary from year to year. For a rapidly expanding company, hardware and server-room expansion and management is a challenge on premise.
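As an illustration of that ability to ramp up and down, here is a minimal sketch in Python using Microsoft's azure-identity and azure-mgmt-compute packages to resize an Azure virtual machine scale set on demand. The subscription, resource group, scale set name and instance counts are hypothetical placeholders, and exact method names can differ between SDK versions:

from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"
client = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

def set_capacity(resource_group: str, scale_set: str, instances: int) -> None:
    # Read the current scale set definition, change only the instance
    # count, and push the update back. The poller blocks until Azure
    # has finished resizing.
    vmss = client.virtual_machine_scale_sets.get(resource_group, scale_set)
    vmss.sku.capacity = instances
    client.virtual_machine_scale_sets.begin_create_or_update(
        resource_group, scale_set, vmss
    ).result()

# Ramp up for a month-end or seasonal peak, then scale back down.
set_capacity("prod-rg", "erp-app", 10)
# ... heavy workload runs here ...
set_capacity("prod-rg", "erp-app", 2)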

In a recent IDC (International Data Corporation) Multi-Client Study (CloudView 2016), respondents said that they expect to increase their cloud spending by approximately 44% over the next two years, and 70% of heavy cloud users are thinking in terms of a "hybrid" cloud strategy.

The idea of a hybrid cloud is to get the best of on-premise deployment while still leveraging cloud services: some work is done on premise, and some in the cloud, e.g. BI or a payment gateway. A combination of both public and private platforms, a hybrid cloud is meant to provide organizations with greater IT and infrastructure flexibility, as well as visibility and control over their cloud usage. The result should be that a hybrid cloud enables business agility, including streamlined operations and improved cost management.

Sounds good, but what does it all mean, and what are the challenges? First, let's review some of the basic concepts.

Public Cloud
A public cloud is one in which the services and infrastructure are provided off-site, over the Internet. Clients do not own the data centre hardware, so there are no capital expenses; instead, providers sell hosting as a "utility" or rental service. Providers offer maintenance, disaster recovery and backup, however basic these may be. This is typically a multi-tenant software solution: each company's data sits in separate blocks on common clustered hardware. Data for individual organisations is kept separate and protected with robust security, and breaches of data with a reliable provider are rare. However, some security standards are not suitable for very sensitive data, rigorous audit trails, or industry-specific compliance.

A public cloud is typically used to host web servers or to develop applications. It is attractive to small and mid-sized enterprises (SMEs) that are happy to use out-of-the-box menu specifications. Virtual machines are configured quickly – often within hours. Some SaaS (Software as a Service) services are placed within a public cloud when they have high levels of built-in security.

Private Cloud
A private cloud is one in which the services and infrastructure are maintained on a private network. It operates on an isolated network and is extremely secure: it keeps data behind a firewall and is built either on premise or in a ring-fenced section of a data centre. A private cloud is a single-tenant solution, with the hardware accessed by one or multiple businesses. It's an ideal solution for enterprise organisations or specialist firms with high levels of security and compliance. Clients generally maintain their own cloud system and own their hardware.

Security and compliance on a private cloud can be configured to meet specific compliance standards. Private cloud systems cost much more than public cloud, and re-configuring them is more complex and lengthy.

Hybrid Cloud
A hybrid cloud uses public and private cloud for different elements of computing: only some elements require high security and customisation, while others do not. A hybrid cloud keeps sensitive data in a private cloud and non-sensitive, generic data (e.g. customer literature) in a cheaper public cloud environment; it is usually hosted by different cloud providers, one for public and one for private. Hybrid cloud benefits companies that experience seasonal spikes, because extra computing power can be deployed quickly and cheaply in the public cloud while sensitive information stays in the private cloud.
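The placement rule at the heart of that hybrid design can be stated very simply. Below is a purely illustrative Python sketch, not a real API: the two endpoints, the Record class and its sensitivity flag are all hypothetical. Sensitive records are kept on the private side; everything else goes to the cheaper public store:

from dataclasses import dataclass

PRIVATE_ENDPOINT = "https://store.internal.example"  # behind the firewall
PUBLIC_ENDPOINT = "https://store.public.example"     # commodity cloud storage

@dataclass
class Record:
    name: str
    is_sensitive: bool

def target_for(record: Record) -> str:
    # Sensitive data stays private; generic content goes public.
    return PRIVATE_ENDPOINT if record.is_sensitive else PUBLIC_ENDPOINT

for rec in (Record("customer-ledger", True), Record("product-brochure", False)):
    print(f"{rec.name} -> {target_for(rec)}")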

A hybrid cloud is the biggest growth area in cloud computing for enterprise businesses. As servers become "smarter", hybrid cloud is estimated to represent 75% of future enterprise cloud computing.

A hybrid cloud does not mean failover to an on-site system; that requires a failover solution or a clustered install, and the failover can be to any other site, whether local, remote, or in the cloud. Nor does hybrid mean an offline, work-on-premise option.

IBM's Institute for Business Value (IBV) polled more than 1,000 C-level executives and found that 78% of respondents deploy a cloud initiative that is fully integrated or coordinated — an increase from 34% in 2012. Enterprises may be embracing the cloud, but they are not yet fully invested in a cloud-only strategy: across 18 industries, 45% of workloads are expected to remain on premise in the near future.

A hybrid cloud deployment is a collaboration of public cloud, private cloud and traditional IT platforms that allows enterprises to customize a cloud solution to the particular needs of their company. The top motivating factors for adopting hybrid cloud solutions, according to the IBM study, include lowering the total cost of ownership, facilitating innovation, improving efficiency and meeting customer expectations.

Among the companies that embrace cloud computing, 76% responded that they were able to expand into new industries, 71% that they created new revenue sources, and 69% that they supported new business models.

Security remains a concern, however, and has become a hurdle that deters companies from investing fully in the cloud. Nearly half of the respondents to IBM's study said that security and compliance risks are a challenge, while 41% said that the cost of the cloud was a deterrent, and 38% feared a disruption to company operations from introducing a new cloud solution.

When survey respondents are segmented by performance, IBM concludes that twice as many high performers have fully integrated their cloud initiatives as low performers.

Nati Shalom recently discussed, in his post "Achieving Hybrid Cloud Without Compromising On The Least Common Denominator", a survey demonstrating that enterprises today often leverage as many as six clouds simultaneously, and the list keeps growing as new technologies sprout up by the minute. "IT markets are not just moving to the cloud — they are moving to 'clouds'," said Ed Anderson, research vice president, and Sid Nag, research director, at Gartner in their report "Market Trends: Cloud Adoption Trends Favor Public Cloud With a Hybrid Twist", published August 4, 2016. "Evidence is mounting that as organizations mature in their usage of cloud services they are opting to use multiple cloud services, bound together through hybrid implementations."

That's why solutions like Azure Stack are critical: they are geared towards multi-cloud scenarios in the context of app migration to the cloud from traditional data centres, while taking into account all of the enterprise-grade considerations involved in such a transition.

Many solutions don't provide the extensibility and interoperability that enterprises need for future-proofing, application deployment portability, and other popular use cases across clouds. Hybrid cloud itself has also proven that it isn't immune to future-proofing pressures, with disruptive technologies arising every day.

Azure users now have a set of building blocks for managing the entire application stack and its lifecycle, across clouds, stacks and technologies. And with Microsoft now having the most open-source developers on GitHub (yup – ahead of Facebook, Angular, and even Docker), Azure is uniquely positioned to achieve this level of openness and interoperability.
This will also ultimately provide a higher degree of flexibility, allowing users to define their own level of abstraction per use case or application. In this manner, cloud portability is achievable without the need to change the underlying code, enabling a true hybrid cloud.
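One way to picture that kind of portability is a thin adapter layer: the application codes against a small interface, and each deployment target supplies its own implementation, so the business logic does not change when a workload moves between clouds or back on premise. A minimal Python sketch, with all class and method names hypothetical:

from abc import ABC, abstractmethod

class BlobStore(ABC):
    # The one abstraction the application depends on.
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

class AzureBlobStore(BlobStore):
    def put(self, key: str, data: bytes) -> None:
        print(f"Azure: uploading {len(data)} bytes as {key}")

class OnPremBlobStore(BlobStore):
    def put(self, key: str, data: bytes) -> None:
        print(f"On premise: writing {len(data)} bytes to {key}")

def archive_invoice(store: BlobStore, invoice_id: str, pdf: bytes) -> None:
    # Identical business logic whichever backend is injected.
    store.put(f"invoices/{invoice_id}.pdf", pdf)

archive_invoice(AzureBlobStore(), "INV-001", b"%PDF-...")
archive_invoice(OnPremBlobStore(), "INV-001", b"%PDF-...")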

Fifty-five percent of CIOs surveyed by Gartner indicated that by 2020 they will structure more than half of their applications as SaaS or manage them in a public cloud infrastructure. Managing and governing public, private and hybrid cloud services requires a focus on cloud management, which in turn requires new roles, processes and technologies.

Key employee roles for the hybrid cloud:
• Database professionals, to filter out the business-critical data from today's data overload. A Big Data Foundation professional will be familiar with tools such as Hadoop and MongoDB.
• Software developers, who no longer just push code: they are pivotal to the user experience and thus to user adoption of cloud solutions.
• Information security managers, who must appreciate the risks to business data and discuss them across the organization (at all levels) to align the key stakeholders in a position to invest in and implement security measures.
• Enterprise architects. Today's solution architects need the skills to adapt to cloud computing and hybrid cloud environments, and to design a scalable, sustainable cloud infrastructure that optimizes the use of private and public cloud. Companies want to avoid ad hoc systems implementations, so architects who understand cloud computing and all its service models are in high demand.
• Business managers working in the cloud, who need to understand how the technical infrastructure supports the business strategy in order to get the benefits of cloud computing and drive their objectives.

Microsoft’s Hybrid cloud blog: https://blogs.technet.microsoft.com/hybridcloudbp/2016/09/

If you are considering how the cloud can benefit your business, then contact us to explore the many options.

Find out about the new integrated Dynamics 365 offerings, e.g.:
• Ask about specific vertical solutions like Synergy MMS for hotel facility management, or 7 Medical HIS and imaging solutions.
• Host your applications in a secure managed cloud, with either fixed-price or usage-based billing.
• Monitor your on-site global networks with cloud-based monitoring systems.
• Use Cortana Analytics and Power BI to turn data into information.
• Back up to the cloud.
• Skype for Business.
• and much, much more.