Archive for the ‘Technology’ category

Dynamics 365 – Licensing and support key dates to be aware of. Ask Synergy Software Systems, Dubai

March 18th, 2017

License Renewal & Anniversary Date:

If you are considering an upgrade to Dynamics 365 for Operations, then your license anniversary or enhancement renewal date is significant: you have the opportunity to do a full platform and license transition. There are specific incentives and license credits available to make this transition when you are on a supported product version.

The Mainstream Support End Date of your Current Dynamics AX Software Version:

If you do not opt to move to a cloud-based license model at your license anniversary, then the next important date to consider is the mainstream support end date for the current product you are using:

Support Dates for Existing Dynamics AX On-premise products:
AX 2009, AX 2012 R1 & R2 – Mainstream support ends on April 10, 2018; Extended support is available until October 12, 2021
AX 2012 R3 – Mainstream support ends on October 12, 2021; Extended support is available until January 10, 2023

Why is it important to be on a Mainstream Supported Product?

There are many reasons to be on a supported product:
1. The option to receive support updates and hotfixes – Microsoft collects reported bugs and issues and systematically releases fixes, keeping the platform up to date and reliable.
2. Regulatory compliance updates end with mainstream support – this means that when your organization is legally obligated to follow regulatory compliance standards, those changes will need to be applied manually.
3. Access to customer resources.

During Extended support, Microsoft provides support for the product and will provide security-related hotfixes.

Your Dynamics Roadmap – Action Plan

So, what should all of these dates mean to you? The answer depends on many things: which version you’re currently using; the range of modules, customisations, and integrations; your hardware investment; internet connectivity; how you are impacted by statutory changes; economic pressures; and so on.

For those on AX 2009, AX 2012 R1 & R2:

It’s ideal to be on a mainstream supported product, and extended support is available through October 2021. In either scenario, you need an action plan to:
◾Decide whether to stay on premise thereafter, or whether to go cloud at some point.
◾Decide to which product you’ll upgrade and whether this will be a one- or two-step process – AX 2012 R3 or direct to Dynamics 365 for Operations?
◾Identify requirements for path chosen (Data migration, customizations, process change, short term hardware investments, whether to upgrade SQL or operating systems, etc.)
◾Budget [time and money] for the requirements gathering [can last several months] and the actual upgrade
◾Identify internal/external resources to execute the project
◾Perform and test the upgrade, and train users on the new version.

Most companies want to have the decision and plan in place before mainstream support ends – which is just over a year away (April 2018 for AX 2009 and AX 2012 R1 & R2).

There are also other incentives to upgrade sooner rather than later.

For those on AX 2012 R3:

As for clients on AX 2009, AX 2012 R1 & R2 (above), the same decisions must be made.
Do you want to upgrade to Dynamics 365 Operations?
Do you want to stay on premise?

While your mainstream support lasts longer, there are benefits and incentives to consider when deciding on a timeline for your changes.

On-premise vs. Cloud Options:

Until February 2017, existing on-premise clients had only these options:
◾Stay with your Perpetual License on AX 2012 or earlier (keep paying Enhancement or Software Assurance)
◾Upgrade to Dynamics 365 for Operations and move to Subscription Only [Cloud] Model (available at license anniversary/renewal)
◾Upgrade to Dynamics 365 for Operations in a Hybrid Model (Perpetual License + Cloud Add-On)
◾ Move to a Dynamics 365 for Operations subscription license but continue to use Dynamics AX 2012 R3 on premise for some time before moving to Dynamics 365 for Operations (“equivalence”)

On February 23rd, Microsoft announced a new, hybrid option based on edge computing:
◾Upgrade to Dynamics 365 for Operations, but stay on premise, either with a subscription license or keep a Perpetual License model.

On Monday last week at the Tech Conference, Microsoft announced more details about the new deployment options for Dynamics 365 for Operations that will be available in Q2 2017.

In addition to a pure-cloud environment, organizations can now choose from two further options for how this can work.

◾The first is a hybrid deployment (called Cloud and Edge) where, for example, critical operations processes can remain in an on-premise database while the power of the cloud is harnessed for additional scalability.

◾ The second option is, essentially, an on-premise option. Microsoft calls this option Local Business Data, where Dynamics 365 resides in your existing datacenter.

Investment Credit and Incentives:

Most companies do their budgeting annually, so planning your roadmap should already be underway in 2017. At the Summit Conference last October, Microsoft announced a 40% discount to existing on-premise clients who want to transition to the new Dynamics 365 Cloud Platform. This incentive is active for 3 years.
If you transition in 2017, you’ll be able to leverage two years of discounts.
If you transition in 2018, you’ll only be able to leverage one year of discounts.
If you do not transition until 2019, you may not get a discount (depending on the transition month).

Under Dynamics 365, licensing SKUs, functionality, and license names also changed, so there is an additional consideration in how your organization is licensed. Like previous license models, Dynamics 365 for Operations has different user license types – each with different user rights. The more precise you can be about what your users need to access, the more accurate your license transition will be and the more money you can save.

Strategic Planning with Synergy Software Systems

Synergy Software Systems can undertake a license and environment audit to help you understand your high-level options and the costs associated with those options. If you feel that the cloud has the exciting new functionality and integration that you’ve been looking for, use the time you have between now and your next license anniversary/renewal to look at Dynamics 365 for Operations with us and decide if it’s right for you.

If you’re on Dynamics AX 2009, AX 2012 R1 or R2, use the time between now and April 2018 to decide how to best leverage your existing investment in your ERP system in the next supported step in your Dynamics roadmap journey.

If you’d like help to better understand your options, then reach out to us on 00971 4 3365589.

Dynamics 365 will have an on-premise option in June 2017

February 28th, 2017

The future of Dynamics AX seemed to take a big hit last year when Dynamics 365 was launched. When AX 7 was launched for the cloud, it was understood there would be an on-premise version based on a private cloud running on the Azure technology stack. That proved to be not so easy, and a Microsoft representative at a user group explained that there would be no on-premise version and to get used to it.

The future, it seemed, was that there would be some on-premise components (manufacturing and retail), but on premise was dead.

Customers and partners immediately began shaking their heads in disapproval. We were particularly blunt in our views. The end message, as perceived by multiple people in the audience, was “take it or leave it”. The fallout was instant: a recent poll by Computer Futures showed that 40% of Microsoft Dynamics Operations customers stated they were going to leave if an on-premise product wasn’t offered within the next 3 years.

Microsoft claims it listens and that it constantly seeks feedback. That, after all, is the basis of its agile development model. It seems it really does, given the recent announcement that an on-premise version is coming in June.

The graphic below was published by Sri Srinivasan in the week of 2/20/2017. The expectation now is that Dynamics 365 for Operations will be generally available on premise in June 2017.

Dynamics 365 for Operations is already available in the cloud and offers the added advantage of Microsoft’s intelligence capabilities: embedded analytics, machine learning, and working with the IoT stack.

However, for many companies that have already heavily invested in infrastructure, or that have unreliable internet connections, or that have business needs to hold data locally, those benefits do not outweigh the need to store their company’s critical data locally. Consider a continuous process manufacturing facility, or a lean manufacturing automotive supplier facing swingeing penalties if the production line is stopped, where workers need to keep the production line humming 24×7 and avoid production delays. Such industries rely on expensive, highly automated machines to produce goods; these machines work on inputs from the ERP system and provide critical output signals that are used for overall management of the production facility. In such scenarios, it is desirable to run these processes locally while leveraging the power of the cloud to maximize efficiency. Also, some geographies require that transactions and personal information be captured and stored locally. Other factors that may apply are when:

• Your company’s leadership holds a non-discretionary attitude towards cloud computing
• Your business model needs control of your: data, data residency, and data isolation
• Your business information needs permanent connectivity
• The business processes require the ability to customize the service infrastructure of the ERP system to meet specific business needs such as scalability
• Your company has recently invested in data center resources
• Your industry’s regulations do not permit data be stored in the cloud
• Your country or geographic area has connectivity limitations

The on-premise deployment option is referred to by Microsoft as “local business data.” Local business data will run your business processes on-premises, and will support local transactions and storage of business data, without replication of that business data to the cloud. The update of business data in the cloud is simply switched off.

For this “local business data” deployment scenario, the application servers and SQL database will run in the customer’s own data center (or could be hosted in a private cloud).

Customers and partners will still be able to manage the application lifecycle through Lifecycle Services (LCS) in the Microsoft Cloud, including designing the business processes, creating and deploying the software image to deploy onto the on-premises nodes, monitoring the on-premises nodes in a system health dashboard, and keeping up with innovation from Microsoft.

Should your views change and your company later want to utilize Dynamics 365 for Operations cloud capabilities, you can turn on data synchronization to the cloud. Please note: until then, the on-premise data will not benefit from Microsoft’s intelligent cloud capabilities like Power BI, AI, or other capabilities available to cloud subscribers.

So:
• New and existing customers will be able to license both local business data, and cloud and edge deployments.
• Customers can license local business data deployments via a Dynamics 365 for Operations license with Software Assurance/Enhancement Plan or a subscription model
• Customers with active Microsoft Dynamics Enhancement Plans or Software Assurance, may upgrade to local business data and remain entitled to access new versions and updates
• Transitions to Dynamics 365 cloud subscriptions are available to customers with active Microsoft Dynamics Enhancement Plans or Software Assurance
• Cloud and edge software and services will be licensed under the existing Dynamics 365 cloud subscription licensing model

We like the cloud for first-time ERP implementations that are small, or ones that have just a few people using AX/Operations. It’s also suitable for companies that have limited integrations/customisations.

If you are an enterprise with multiple geographical units, then you will come under different statutory regimes for taxes and payroll, and each country’s economic budget may require you to make system updates that cannot wait 8 hours or more for Microsoft to act.

For complicated manufacturing implementations, major companies with many licenses, or rapidly growing companies that are increasing their market share, the current AX 2012 on premise, or the on-premise hybrid when it comes out, is for now a more pragmatic option. Uptime, system tuning when performance is slow or unresponsive, minimizing downtime and scheduling it at your own convenience, dedicated support, faster code moves, and fast support response times are critical factors that need rapid access to the production server, which is not currently possible in the cloud.

Expect a lot more details to emerge on this topic at the D365 Technical Conference in Seattle next month and at the AXUG meet in Amsterdam in early April.

Synergy will be at the AXpact CEO meeting the day before to meet with Chandru Shankar, Microsoft’s Manufacturing Industry Director; Sri Srinivasan, General Manager for Microsoft Dynamics 365 for Operations in the Cloud and Enterprise Group; and Mike Ehrenberg, Microsoft Technical Fellow, to get more insight.

The cloud – just another tool set

February 23rd, 2017

The cloud is a term that has for a long time been the subject of hype.

Conferences, blogs, and media outlets tell us the cloud is the answer, the cloud is cheaper, the cloud is the way of the future, the cloud handles your DR, the cloud manages BI, IoT, etc. Major vendors, none more than Microsoft, are pushing the SMAC message: “social first”, “mobile first”, “analytics first”, and “cloud first”.

For many of us, reactions vary from confusion to concern, or even anger. There are still many who dismiss the idea of ‘cloud anything’ when it comes to data.

The reality is that almost everyone uses the cloud daily: Google, Hotmail, GPS, Facebook, YouTube, etc.
The cloud is a tool that can really enable you to solve issues, some unsolvable any other way, without getting caught up in the details of implementing every little part of the system. That’s an idea most of us embrace, even if we never thought about it. How many of us deal with hardware? How many of you install or configure Windows? Who worries day to day about SQL backups, or has scripts/tools/products to automatically back up new databases?

The reality is that few of us have ever seen an email server, or an ERP production database server, with our own eyes, despite connecting to many.

We all move at different paces. Some of us still deal with SQL Server 2008, 2005, or 2000, and some of us need to manage those platforms for years to come. Using an application in the cloud does not mean there is no work for on-premise IT: there is still helping to build applications on Azure SQL Database and dealing with data integrity, quality, and security issues through a remote connection. Device connectivity, internet connections, and printers still need managing.

The cloud gives us a new set of tools and services that take away some of the details and drudgery. Think electric saws and power drills instead of hand tools. Sometimes that’s fantastic, and it enables more rapid, more scalable deployment of resources. Sometimes it’s dangerous, because the vendors haven’t completely thought through the process, or integration, or latency, or licensing and contractual terms.

We shouldn’t be doing tasks that can easily be automated and done better, faster, and cheaper.
When we no longer need to handle day-to-day management of hardware, databases, and interfaces, we can look upwards and outwards and consider how to use these tools to ensure our organizations continue to improve effectiveness and efficiency.

So what tools does Microsoft offer on Azure?

Cortana Intelligence Suite is an example – Microsoft’s big data, machine learning, and analytics platform.
Whether you work with a company that buys “best of breed” solutions and you manually manage your data or you only have one system but you lack the ability or wherewithal to analyze it, CIS can help.

CIS has several components and each has an intended purpose, though you can do similar functions across several elements, and may only use a few of those for your business.

Microsoft Azure
Azure’s intent is to provide cloud-hosted virtual servers, apps, and services to Microsoft’s customers.
Azure offers developers both Microsoft and third-party tools to operate in the cloud and takes full advantage of the IoT. Microsoft is playing to win the internet. It already has more data centres than Google and Amazon combined. The rapid improvements of the platform are steadily winning over developers and IT professionals.

Azure Data Catalog
Use it to tie together different types of data from many sources, when the fields or tables may be labeled differently. The Azure Data Catalog allows a user to look up a data source in a common place in the cloud. The future of data management will tie in those end users with business knowledge and allow them to identify data relevant to your IT team.

Azure Data Factory
Azure Data Factory provides tools to pull together all your data into one place, to transform it, and to distribute it! (Pelluru, 2016)

Azure Event Hub
Azure Event Hub is used for truly big data and the IoT. It allows mass data to be brought in for analysis and is the opening through which large amounts of data are funnelled, for example temperature monitoring on all your reactors or the status of sensitive product shipments. The data can then be consumed, e.g. for behaviour analysis on a mobile device.
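As a rough sketch of how such readings might be sent to an Event Hub from Python, here is a minimal example using the azure-eventhub package; the connection string, hub name, and sensor values are hypothetical placeholders, not taken from this article.

```python
# Illustrative sketch only: send a few temperature readings to an Event Hub.
# The connection string and hub name below are hypothetical placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>"
EVENTHUB_NAME = "reactor-temperatures"  # hypothetical hub name

readings = [{"reactor": "R1", "temp_c": 78.4}, {"reactor": "R2", "temp_c": 81.9}]

producer = EventHubProducerClient.from_connection_string(CONNECTION_STR, eventhub_name=EVENTHUB_NAME)
with producer:
    batch = producer.create_batch()          # events are sent in batches
    for r in readings:
        batch.add(EventData(json.dumps(r)))  # each reading becomes one event
    producer.send_batch(batch)
```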

Azure Data Lake
If Azure Event Hub is the front door, then the lake is the storage closet for large amounts of data.
Use the lake to store all your data in its original, non-relatable format, and integrate it with any data warehouse you already have. It differs from Blob storage mainly in the amount of data you’re collecting.

Azure SQL Database
The SQL database needs no introduction, but how do SQL databases relate to Azure? A SQL database is where your data resides in tables, with key identifiers, which makes it relatable to other sets of data. SQL is simply the query language for this particular kind of database. Azure SQL Database puts it in the cloud.
“But I don’t want my data in the cloud! That’s not secure.” Really? Do you even bother to lock the door of your server room? Are you really up to date with all the security patches and hotfixes? I challenge you to rethink your philosophy and do a little research on the matter. Storage in the cloud is generally more secure than an on-premises solution, depending on several factors.
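To make that concrete, here is a minimal sketch of querying an Azure SQL database from Python with pyodbc; the server, database, credentials, and table names are hypothetical placeholders.

```python
# Minimal sketch: query an Azure SQL Database over an encrypted connection.
# Server, database, credentials, and table are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # Azure SQL logical server
    "DATABASE=SalesDb;"
    "UID=app_user;PWD=<password>;"
    "Encrypt=yes;TrustServerCertificate=no;"
)

cursor = conn.cursor()
cursor.execute("SELECT TOP 5 CustomerId, Name FROM dbo.Customers ORDER BY Name")
for row in cursor.fetchall():
    print(row.CustomerId, row.Name)

conn.close()
```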

Azure Machine Learning
AML is where you go to create your machine learning models and APIs. What is machine learning? It’s an evolutionary, self-learning, adapting process. A machine looks at the data and suggests something. If that something doesn’t make it better, then it will try something else. If that still doesn’t make it better, it will try something else again. It will keep trying until it finds something that works, and then try to make it work better.
Deep Blue, the IBM computer that won a chess match against Kasparov, not only worked with brute-force calculation but also developed its own heuristics to better evaluate which positions to analyse further.
Machine learning is also about predictive analysis based on the data you feed the machine. One example: when you go to Amazon, Google, or Booking.com, they recommend items based on your past purchases and searches. That’s machine learning!

Correlation of data to establish possible relationships has many applications, from advertising to analysing medical symptoms. How might one use machine learning in manufacturing? You could increase the yields of your finished goods when you have data on specific factors: what was done prior to production (cleaning, maintenance, etc.), which operator does the best job on a particular product, and even which raw materials to use. Depending on how much data you put in on your raw materials, such as quality data, ML could also help with product engineering, as the sketch below illustrates.
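Here is a toy illustration of that kind of predictive model, using plain scikit-learn rather than Azure ML itself; the factors and figures are invented purely for illustration.

```python
# Toy sketch: predict finished-goods yield from two process factors.
# The data below is invented purely for illustration.
from sklearn.linear_model import LinearRegression

# Features: [hours_since_cleaning, raw_material_quality_score]
X = [[2, 0.92], [10, 0.88], [24, 0.81], [5, 0.95], [18, 0.84]]
y = [97.1, 94.3, 89.8, 97.9, 91.2]   # observed yield (%) for each batch

model = LinearRegression().fit(X, y)

# Predicted yield for a batch run 8 hours after cleaning, quality score 0.90
print(model.predict([[8, 0.90]]))
```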

The possibilities are nearly endless.

Azure HDInsight
HDInsight is Hadoop as a service. Hadoop brings the processing to the data, which is faster than moving large files across your network. Hadoop is an open-source, Java-based programming framework that supports the processing and storage of extremely large data sets in a distributed computing environment. It is part of the Apache project sponsored by the Apache Software Foundation. It allows for the distributed processing of large data sets across clusters of computers using simple programming models, and it is designed to scale up from single servers to thousands of machines, each offering local computation and storage.
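The classic way to picture Hadoop’s programming model is a word count split into map, shuffle, and reduce phases. The pure-Python sketch below only imitates that flow on a single machine; it is not HDInsight code.

```python
# Single-machine imitation of Hadoop's MapReduce flow for a word count.
from collections import defaultdict

documents = ["the quick brown fox", "the lazy dog", "the quick dog"]

# Map phase: each document independently yields (word, 1) pairs,
# so in a real cluster documents could be processed on different nodes.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group all pairs by key (the word).
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: sum the counts for each word.
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts)   # e.g. {'the': 3, 'quick': 2, ...}
```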

As the World Wide Web grew in the late 1990s and early 2000s, search engines and indexes were created to help locate relevant information amid the text-based content. In the early years, search results were returned by humans. But as the web grew from dozens to millions of pages, automation was needed, and web crawlers were created. An open-source web search engine called Nutch evolved – the brainchild of Doug Cutting and Mike Cafarella, who wanted to return web search results faster by distributing data and calculations across different computers so multiple tasks could be accomplished simultaneously. During this time, another search engine project, Google, was in progress based on the same concept – storing and processing data in a distributed, automated way so that relevant web search results could be returned faster.

In 2006, Cutting joined Yahoo and took with him the Nutch project as well as ideas based on Google’s early work with automating distributed data storage and processing. The Nutch project was divided – the web crawler portion remained as Nutch and the distributed computing and processing portion became Hadoop (named after Cutting’s son’s toy elephant). In 2008, Yahoo released Hadoop as an open-source project. Today, Hadoop’s framework and ecosystem of technologies are managed and maintained by the non-profit Apache Software Foundation (ASF), a global community of software developers and contributors.

Azure Stream Analytics (ASA)
Stream Analytics streams data over the cloud and uses it for near real-time analysis. ASA compares new data coming in (by using the Data Factory “Move it!” capability to periodically move data copies from SQL DB to blobs for consumption) against historical data to simplify analysis and to identify outliers. Stream Analytics is easy to use and offers many possibilities: it takes less than 15 minutes to set up a data connection for temperature and throw together a graph for visual consumption.
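A crude way to picture that kind of outlier check (ASA itself uses a SQL-like query language; the Python below, with invented readings and an invented threshold, just illustrates the idea of comparing new data against history):

```python
# Crude illustration of comparing a stream of new readings against history.
# The values and threshold are invented for illustration only.
historical = [71.8, 72.4, 72.1, 71.9, 72.3]   # past temperature readings (degrees C)
mean = sum(historical) / len(historical)
threshold = 3.0                               # allowed deviation

incoming = [72.2, 72.5, 78.9, 71.7]           # new readings arriving from the stream
for reading in incoming:
    if abs(reading - mean) > threshold:
        print(f"Outlier detected: {reading} C (historical mean {mean:.1f} C)")
```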

Power BI
Power BI brings visualization to your analysis through interactive reports within a browser, tablet, or mobile device. If you are a fan of PowerPivot or the eye-pleasing graphs out of Excel, you’ll enjoy the capabilities of Power BI. Show the C-suite KPIs, top customers, and gross margins by product – all on one page.

Cortana
Speech is not the only way that you can interact with Cortana. Cortana can analyze visually as well as audibly.
How about an app that can predict your age based on a picture of your face using Cortana?
What about QA tests based on comparing current images to approved sample images?

Machine learning and telemetry: Microsoft itself collects data on what functions you use in its software and how you use them, to better optimize the software and the user interface. It does not collect data on personal use.

Security – Verizon Data Breach insights

February 18th, 2017

The 2017 Verizon Data Breach Digest, published Tuesday, found that the effects of a breach are spreading to even more parts of an enterprise, increasingly causing problems outside of IT.

The 2017 Digest examines 16 different scenarios drawn from Verizon’s Research, Investigations, Solutions and Knowledge (RISK) Team’s investigation of 1,400 breach cases over the past three years. The scenarios were broken up into the following four breach types:

1. The human element.
Those breaches in which humans:
- had been compromised,
- or simply made a mistake,
- or intentionally acted maliciously.

Two of the scenarios—hacktivist attack and partner misuse—were labeled as “lethal.”

The hacktivist attack occurs when a hacker targets a company in response to a perceived injustice committed by the firm.
Partner misuse refers to an attack in which an indignant stakeholder attacks the firm from the inside. Another example of this kind of breach is a disgruntled ex-employee.

2. Conduit devices

Conduit devices are points of entry by which an attacker gains access to an organization’s network. Mobile assault and IoT calamity were the names given to the lethal scenarios of this breach type.

- A mobile assault refers to a business traveler who uses an unsecure Wi-Fi connection, which leads to his phone being compromised.

- An example of an IoT calamity is a major university that was breached through its connected vending machines and smart light bulbs.

3. Configuration exploitation

“From a system standpoint, misconfigured devices are the vectors of compromise; from a network standpoint, misconfigurations allow for easy lateral movement and avenues for data exfiltration.”

Lethal scenarios of this type are a DDoS attack and an ICS onslaught.

- One example of a major DDoS attack is the Mirai botnet that took down the DNS provider Dyn, and almost took down an entire country.
- An ICS onslaught occurs when an industrial control system is compromised, and may lead to both massive physical damage and data leaks.

4. Malicious software

In the Verizon report, none of these were labeled as lethal. Examples are traditional malware, RAM scraping, spyware, and keylogger software. The Digest lists the three primary purposes of malware as to “establish a beachhead, collect data, and exfiltrate data.”

To respond to a breach, the Verizon Data Breach Digest recommends taking the following five actions:
1.”Preserve evidence; consider consequences of every action taken.”
2.”Be flexible; adapt to evolving situations.”
3.”Establish consistent methods for communication.”
4.”Know your limitations; collaborate with other key stakeholders.”
5.”Document actions and findings; be prepared to explain them.”

Email Insights – easy index and search for your emails.

February 18th, 2017

Email may have been a convenience 15 years ago, but now it’s a hassle most of us want to ignore whenever possible. Searching through a full email inbox in Outlook or Gmail is frustrating. There are numerous variables that you may want to use: sender, send date, topic, recipients, spelling variations, file size, with or without attachment, etc.

Even the search engine masters at Google have not managed to come up with a way to fully address the shortcomings of email searches. Microsoft Garage, Microsoft’s site for experimental projects, may have solved the problem with Email Insights, an app for Windows 10 Anniversary Edition that indexes emails from both Outlook and Gmail and tries to make searching them simple.

Microsoft senior research director Suresh Parthasarathy summed up the team’s strategy: it wants to duplicate the reliability of a search engine. “It is not just about the algorithms, but about the user experience. We present a novel browser-like email experience that feels lightweight and works just like web search.” The web search backbone is even more apparent with the inclusion of autocomplete, spell checking, and even an “intent pane” that looks a lot like a Google featured snippet.

Email Insights is programmed with “fuzzy” name recognition, to suggest appropriate spellings based on your email history and the contents of your inbox. It can also reportedly abstract more than just names. “A user need not remember all the exact keywords or spellings for their queries,” Parthasarathy said. “The idea is to remove the cognitive load of a user while searching,” essentially fulfilling the same role as a traditional search engine.

The app provides an optional taskbar-pinnable search field, to further separate it from the typical email client.

Email Insights is designed to separate the user from the email inbox. It isn’t yet a full-featured email client replacement for Outlook or Gmail: it’s designed as a companion app to make using both of those simpler. If Email Insights’ new search algorithm lives up to the developers’ claims, then it will probably be included in future versions of Outlook. For now it’s a separate app with one specific purpose: indexing and simplifying email searches.

Email Insights is currently a Windows 10 Anniversary Edition exclusive, and it only supports Gmail and Outlook accounts. If you meet those criteria then you can download it here.

• Light-weight email application focused on search.
• Mix of Relevance and Recency based results.
• Guided search through Email Autocomplete, Fuzzy person name search.
• Commands for quick action.
• Tabbed UI for convenience.

Dynamics 365 launch event Microsoft Gulf

February 5th, 2017

Last week Synergy staff attended the Dynamics 365 regional launch day. This gave insights into the Microsoft Dynamics solution portfolio. Siegfried Leiner, Principal Program Manager, Dynamics CRM, Microsoft, gave a ‘deep dive’ keynote speech.

The event was kicked off by Samer Abu Ltaif, Regional General Manager, Microsoft Gulf, and Karim Talhouk, Regional Director, Microsoft Business Solutions, Microsoft Gulf, who presented how digital transformation is happening in the Gulf. Steve Plimsoll, Chief Digital and Data Officer, PWC, and Harris Mygdalis, Deputy CIO, Commercial Bank of Dubai, gave further insights.

This well attended event attracted customers with a wide range of requirements. Mobility, analytics, and integration were common themes.

Artificial intelligence – what does it mean for your company in Dubai?

January 21st, 2017

At the World Economic Forum annual meeting in Davos-Klosters, Switzerland on Tuesday, Microsoft CEO Satya Nadella explained that his company has designed its artificial intelligence (AI) solutions specifically to augment human workers, not replace them.

In a panel discussion with other tech leaders, Nadella said that AI, much like user experience, can be designed with a goal in mind. While some are designing AI systems to think like humans in order to replace them, he said, Microsoft is designing systems to assist humanity. Nadella said that Microsoft was utilizing “pragmatic principles that can guide AI creation.” He also mentioned that this isn’t a new strategy for Microsoft, as he pointed out this idea in his 10 rules for AI that he posted in June 2016.

Of the rules initially proposed by Nadella, the two most relevant are:

“A.I. must be designed to assist humanity,”
and
“A.I. must maximize efficiencies without destroying the dignity of people.”

In his original post on Slate, he wrote, “we want to build intelligence that augments human abilities and experiences. Ultimately, it’s not going to be about human vs. machine.”

Nadella also spoke on the mission of tech companies to democratize AI. “To me, the key right now in this next phase of AI is: How do we put tools so that others can create intelligence in every walk of life?”

Erik Brynjolfsson is an economist at the Massachusetts Institute of Technology (MIT) and co-author of The Second Machine Age, a book that asks what jobs will be left once software has perfected the art of driving cars, translating speech, and other tasks once considered the domain of humans.
Some fear a vision of a world where computers entrench the power of a wealthy elite and push the majority into poverty.

Brynjolfsson identifies various astonishing technologies lining up to encroach on human labour. Such has always been the case with new technologies. Taxis generally work better than rickshaws, and now we have self-driving cars. Lifts work 24 hours a day, 7 days a week, with no holidays, pay rises, or pensions.

The question is whether the rate and adoption of technological change in the digital information age is of a different order to the industrial revolution. Unlike much of the 20th century, we’re now seeing a falling ratio of employment to population. Maybe, with a population that is aging pretty much globally (except perhaps in the Middle East and Nigeria), machines taking over work from humans may prove to be a necessity.

There has been much discussion of a tipping point being reached with many technologies – growth is exponential rather than linear. For most of the second half of the twentieth century, the economic value generated – a country’s productivity – grew hand in hand with the number of workers. Since 2000 the two measures have diverged: a gap has opened up between productivity and total employment. That gap has widened significantly, and reflects continued economic growth with no associated increase in job creation. Despite all of Trump’s electioneering claims about jobs, Obama actually increased jobs, but many jobs were lost not to Mexican and Chinese workers but to automation. Californian orange orchard growers are not looking to switch orange-picking jobs from Mexicans to Americans; they are automating the job. A wide range of studies shows that simple algorithms generally provide better decisions than expert intuition. The City of London financial sector is not so much threatened by Brexit and the loss of jobs to Frankfurt as by its bankers being replaced by machines that calculate faster and more consistently with less risk.

Another trend: over the past quarter of a century the income gap between the richest and the poorest in OECD countries has continued to widen. Today the average income of the richest in these countries is nine times that of the poorest. For the first time since the Great Depression, over half the total income in the United States went to the top 10 percent of Americans in 2012. Between 1973 and 2011 the median hourly wage in the US barely changed, growing by just 0.1 percent per year. This wealth gap isn’t restricted to America. In Sweden, Finland, and Germany, income inequality has grown more quickly over the past 20 to 30 years than in the US.
The message again is: be smarter than a machine, or cheaper.

“Production in the second machine age depends less on physical equipment and structures and more on the four categories of intangible assets: intellectual property, organizational capital, user-generated content, and human capital,” the book states, and it points out that in the US the share of GDP that goes to labor has declined over the past decade, falling to its lowest point, 57.8 percent, in the third quarter of 2010. Much more has gone to assets and to owners.

The Nobel Prize-winning economist Wassily Leontief stated as long ago as 1983 that ‘the role of humans as the most important factor of production is bound to diminish in the same way that the role of horses in agricultural production was first diminished and then eliminated by the introduction of tractors’.

John Maynard Keynes forecast this as an inevitable outcome of society discovering ways to make labour more efficient more rapidly than finding new uses for labour. There’s no economic law that says ‘you will always create enough jobs or the balance will always be even’; it’s possible for a technology to dramatically favour one group and hurt another, and for there to be fewer jobs overall.

A law firm may not hire as many researchers because a machine can scan through hundreds of thousands or millions of documents and find the relevant information for a case or a trial much more quickly and accurately than a human. Call centre operators are gradually being replaced by question-answering, automated systems. Cortana on your PC shows the power of a digital assistant with voice recognition. Carl Benedikt Frey and Michael A. Osborne from the Oxford Martin School & Faculty of Philosophy in the UK write in their report The Future of Employment: “According to our estimate, 47 percent of total US employment is in the high risk category, meaning that associated occupations are potentially automatable over some unspecified number of years, perhaps a decade or two.” During the coming decades they forecast two “waves of computerisation” during which different categories of jobs will be washed away, with no field of employment left untouched.

“In the first wave, we find that most workers in transportation and logistics occupations, together with the bulk of office and administrative support workers, and labour in production occupations, are likely to be substituted by computer capital. As computerised cars are already being developed and the declining cost of sensors makes augmenting vehicles with advanced sensors increasingly cost-effective, the automation of transportation and logistics occupations is in line with the technological developments documented in the literature. Furthermore, algorithms for big data are already rapidly entering domains reliant upon storing or accessing information, making it equally intuitive that office and administrative support occupations will be subject to computerisation. The computerisation of production occupations simply suggests a continuation of a trend that has been observed over the past decades, with industrial robots taking on the routine tasks of most operatives in manufacturing. As industrial robots are becoming more advanced, with enhanced senses and dexterity, they will be able to perform a wider scope of non-routine manual tasks. From a technological capabilities point of view, the vast remainder of employment in production occupations is thus likely to diminish over the next decade.”

They also predict disruption to jobs in services industries from personal and household services robots, automation of more routine sales roles, such as cashier and telemarketers, and from prefabrication of buildings to construction jobs.

Digital technology drastically reduces the cost of technologies, as well as the infrastructure and people needed to support the industries that use them. Digital photography is an example. As Brynjolfsson and McAfee point out, in an age where our photos sit on hard drives rather than in ring-bound albums, the need for a large number of workers disappears. “These photos are all digital, so hundreds of thousands of people who used to work making photography chemicals and paper are no longer needed. In a digital age, they need to find some other way to support themselves.”

Once an algorithm is digitised it can be replicated and delivered to millions of users at almost zero cost.
Reducing the cost of using a technology to the point where it is accessible to much larger numbers of people empowers them to build new businesses. The web led to an explosion in online companies. For example, courses taught by some of the world’s most prestigious institutions are now freely available online. While the cost of taking photos may have plummeted, the same cannot be said of many essentials people need to survive – food, drink, and fuel. Companies like Instagram and Facebook employ a tiny fraction of the people that were needed at Kodak. Nonetheless, Facebook has a market value several times greater than Kodak ever had, has created at least seven billionaires, and employs a lot of people in hitherto unknown jobs.

So is it all bad news? Some jobs remain exceptionally tricky for robots; even simple manual tasks like walking over uneven terrain are beyond the capabilities of most modern bots. The phenomenon is known as Moravec's Paradox, an observation by leading AI researchers in the 1980s that computers find hard the tasks we find easy, and vice versa. While it might take a human seconds to fold a towel, a robot made to carry out the task in 2010 took nearly 25 minutes. Cooks, gardeners, repairmen, carpenters, dentists, and home health aides are not about to be replaced by machines in the short term. All of these professions involve a lot of sensorimotor work, and many of them also require the skills of ideation, large-frame pattern recognition, and complex communication. Machines are still very clumsy: they don't have the agility, and few if any robots can pick up a coin that's on a desk, even though a two- or three-year-old child could.

A calculator makes an accountant more efficient but does not replace the need for an accountant.
In many ways digital technologies can similarly complement, not substitute for, creativity.

The stunted social skills of machines should mean that salespeople, managers, and entrepreneurs have a reasonably bright future, as will nurses, kindergarten teachers, and home help aides.

The career advice that Google chief economist Hal Varian frequently gives is: "Seek to be an indispensable complement to something that’s getting cheap and plentiful.”

Digital revolution – does it apply to me? ask Synergy Software Systems

January 19th, 2017

Digital transformation has been a hot topic for at least the last 5 years and is increasingly becoming reality. For those of us who lived through the re-engineering of the early 90s, this seems to be another twist of a familiar tale. However, there are major differences. Digital is no longer the shiny front end of the organization – it’s integrated into every aspect of today’s companies. As digital technologies continue to transform the economy, many leaders are struggling to set a digital strategy, shift organizational structures, and remove the barriers that are keeping them from maximizing the potential impact of new digital technologies.

A workable definition is that Digital disruption is the change that occurs when new digital technologies and business models affect the value proposition of existing goods and services.

A current aphorism is that you either have to be smarter than a robot or cheaper.

Recent developments in robotics, artificial intelligence, IoT, digital printing, virtual reality, and machine learning have put us on the cusp of a new automation age. Robots and computers can perform a range of routine physical work activities better and cheaper than humans, and are now capable of using cognitive capabilities once considered too difficult to automate successfully, such as making tacit judgments, sensing emotion, or even driving. Automation is already changing the daily work activities of everyone, from retailers, miners and landscapers to commercial bankers, fashion designers, welders, DBAs and CEOs.

What will the impact be on productivity? Previous technical revolutions, such as the introduction of the steam engine or personal computing, delivered annual productivity increases of less than 1%. The speculation now is that the new changes will increase productivity by 1 to 1.5%, an unprecedented rate of change, with many economic, social, and political implications.

Fifty-two percent of the Fortune 500 have merged, been acquired, or gone bankrupt since 2000.
A study by Richard Foster from Yale shows that, in the S&P 500, the average age of a company in 1959 was about 58 years. It’s now down to 15, and it’s going to be 12 by 2020. There’s no time to wait. Digital Darwinism is unkind to those who wait.

“We’re talking about a three to four-times compression in terms of age of a company since the 50s and 60s. So, if you’re not making the shift, if you’re not even moving in that direction, you’re probably going to be merged or acquired, or go bankrupt.”

According to the new MIT Sloan Management Review and Deloitte University Press report, “Aligning the Organization for Its Digital Focus”, nearly 90% of more than 3,700 business executives, managers, and analysts from around the globe say that they anticipate their industries will be moderately or greatly disrupted by digital trends. Yet less than half (44%) currently believe their organization is adequately preparing for this digital disruption. (Ray Wang)

Also see this HBR post for similar survey results

The most disrupted industries typically suffer from a perfect storm of two forces. First, low barriers to entry into these sectors lead to more agile competition. Secondly, they have large legacy business models which often generate the majority of their revenue. These organizations, therefore, have embedded cultural and organizational challenges when it comes to changing at the pace required. Digital companies can reach new customers immediately and at virtually zero marginal cost. They can compete in new sectors by collaborating with peers and competitors.

In the first wave of the commercial Internet, the dot-com era, falling transaction costs altered the traditional trade-off between richness and reach. Rich information was suddenly communicated broadly and cheaply, and changed how products are made and sold. Strategists made hard choices about which pieces of their businesses to protect and which to abandon. They learned to repurpose some assets to attack previously unrelated businesses. Virtual companies relied on outsourcing and offshoring, owned little, and made nothing. Incumbent value chains were “deconstructed” by competitors focused on narrow slivers of added value. Traditional notions of who competes against whom were upended—Microsoft gave away Encarta on CDs to promote sales of PCs and incidentally destroyed the business model of the venerable Encyclopædia Britannica.

With Web 2.0, the economies of mass scale evaporated for many activities and small became beautiful. It was the era of the “long tail” and of collaborative production on a massive scale. Minuscule enterprises and self-organizing communities of autonomous individuals surprised us by performing certain tasks better and more cheaply than large corporations. Hence Linux, hence Wikipedia and open source. Those communities grow and collaborate without geographic constraint, and major work is done at significantly lower cost – and often at zero price.

Many strategists adopted and adapted to these new business architectures. IBM embraced Open Source to challenge Microsoft’s position in server software; Apple and Google curated communities of app developers so that they could compete in mobile; SAP recruited thousands of app developers from among its users; Facebook transformed marketing by turning a billion “friends” into advertisers, merchandisers, and customers.

Where are we now? Hyperscaling and connectivity. Big—really big—is now beautiful. The cloud, new databases, new processing power, new BI tools, predictive analytics, data from IoT correlated with contextual search, delivered anytime anywhere on any device. Social media and smart phones are ubiquitous and real time news and peer opinion is replacing traditional news channels, and marketing and government communications. At the extreme—where competitive mass is beyond the reach of the individual business unit or company—hyperscaling demands a bold, new architecture for businesses.

We are only at the beginning of what the World Economic Forum calls the “Fourth Industrial Revolution,” characterized not only by mass adoption of digital technologies but by innovations in everything from energy to biosciences. Three waves are visible: the digital consumer, who enjoys more interactive and personalized experiences thanks to SMAC (social, mobile, analytics and cloud) technologies; the digital enterprise, which leverages SMAC technologies to optimize the cost of corporate functions and to transform enterprise collaboration for greater productivity; and the emerging digital operations wave, where companies are revolutionizing business with the use of artificial intelligence, robotics, cognitive computing, and the Industrial Internet of Things.

Speculation about the effects of technologies often suffers from extreme optimism or pessimism. In the 1930s, several countries were enthusiastically experimenting with using new rocket technology to deliver mail, and in 1959 the United States trialed mail delivery via cruise missile, a proposition that could now be regarded as comical, yet it has surfaced again with drone deliveries.

Jo Caudron and Dado Van Peteghem in their book Digital Transformation highlight 10 business models behind digital disruption. Professor Michael Wade, co-director of the IMD Leading Digital Business Transformation course has highlighted 7 strategies to respond to disruptors.

10 Hyper-Disruptive Business Models
1.The Subscription Model (Netflix, Dollar Shave Club, Apple Music) Disrupts through “lock-in” by taking a product or service that is traditionally purchased on an ad hoc basis, and locking-in repeat custom by charging a subscription fee for continued access to the product/service
2.The Freemium Model (Spotify, LinkedIn, Dropbox) Disrupts through digital sampling, where users pay for a basic service or product with their data or ‘eyeballs’, rather than money, and then charging to upgrade to the full offer. Works where marginal cost for extra units and distribution are lower than advertising revenue or the sale of personal data
3.The Free Model (Google, Facebook) Disrupts with an ‘if-you’re-not-paying-for-the-product-you-are-the-product’ model that involves selling personal data or ‘advertising eyeballs’ harvested by offering consumers a ‘free’ product or service that captures their data/attention
4.The Marketplace Model (eBay, iTunes, App Store, Uber, AirBnB) Disrupts with the provision of a digital marketplace that brings together buyers and sellers directly, in return for a transaction or placement fee or commission
5.The Access-over-Ownership Model (Zipcar, Peerbuy, AirBnB) Disrupts by providing temporary access to goods and services traditionally only available through purchase. Includes ‘Sharing Economy’ disruptors, which takes a commission from people monetising their assets (home, car, capital) by lending them to ‘borrowers’
6.The Hypermarket Model (Amazon, Apple) Disrupts by ‘brand bombing’ using sheer market power and scale to crush competition, often by selling below cost price
7.The Experience Model (Tesla, Apple) Disrupts by providing a superior experience, for which people are prepared to pay
8.The Pyramid Model (Amazon, Microsoft, Dropbox) Disrupts by recruiting an army of resellers and affiliates who are often paid on a commission-only model
9.The On-Demand Model (Uber, Operator, Taskrabbit) Disrupts by monetising time and selling instant-access at a premium. Includes taking a commission from people with money but no time who pay for goods and services delivered or fulfilled by people with time but no money
10.The Ecosystem Model (Apple, Google) Disrupts by selling an interlocking and interdependent suite of products and services that increase in value as more are purchased. Creates consumer dependency.

Business leaders are now more intent on disrupting before they are disrupted. How can you drive value from data in new ways? How can you shorten product development cycles? How can you tap into predictive analytics and social media to determine the right strategy? Success is not just about changing strategies; increasingly the need is for agility to execute multiple strategies concurrently. Such success requires CEOs to develop new leadership capabilities, new workforce skills, and new corporate cultures and processes to support digital transformation. Mobile, ‘work from home’, BYOD, self-service, collaboration tools like Yammer, Teams, and Skype for Business, e-payments, and digital signatures are tools that offer new ways of working.

For society, the implications of the Fourth Industrial Revolution are profound – from saving lives to creating jobs to better stewardship of the environment. For example our Healthcare solution built on Dynamics CRM, is providing guided pathways to the optimal route for treatment and is delivering huge cost savings and more effective care delivery by better targeting and use of resources.

Strategies to Respond to Digital Disruption
1.The Block Strategy. Using all means available to inhibit the disruptor. These means can include claiming patent or copyright infringement, erecting regulatory hurdles, and using other legal barriers.
2.The Milk Strategy. Extracting the most value possible from vulnerable businesses while preparing for the inevitable disruption
3.The Invest in Disruption Model. Actively investing in the disruptive threat, including disruptive technologies, human capabilities, digitized processes, or perhaps acquiring companies with these attributes
4.The Disrupt the Current Business Strategy. Launching a new product or service that competes directly with the disruptor, and leveraging inherent strengths such as size, market knowledge, brand, access to capital, and relationships to build the new business
5.The Retreat into a Strategic Niche Strategy. Focusing on a profitable niche segment of the core market where disruption is less likely to occur (e.g. travel agents focusing on corporate travel and complex itineraries; booksellers and publishers focusing on the academic niche)
6.The Redefine the Core Strategy. Building an entirely new business model, often in an adjacent industry where it is possible to leverage existing knowledge and capabilities (e.g. IBM to consulting, Fujifilm to cosmetics)
7.The Exit Strategy. Exiting the business entirely and returning capital to investors, ideally through a sale of the business while value still exists (e.g. MySpace selling itself to Newscorp)

As the world moves to a more digital future, so the threats change. There is more data, so there is more to steal and to corrupt. Security threats – phishing, Trojans, malware, hacking of politicians’ email or government or company files or credit card details – are now major challenges for all. As data grows (how many hours of YouTube video get uploaded each second?), topics like high-speed internet and edge computing become more important.

Microsoft organisational changes from 1 February 2017.

January 11th, 2017

Microsoft is combining its Small and Mid-Market Solutions & Partners (SMS&P) and Enterprise Partner Group (EPG) business units in an attempt to streamline business processes. The changes, which will take effect from February 1, will affect its sales, partner, and services teams, and will see both units come together as one under its Worldwide Commercial Business, led by executive vice-president, Judson Althoff. Corporate vice-president of mid-market solutions and partners, Chris Weber, will lead the combined business.

This seems to echo former CEO Steve Ballmer’s 2013 One Microsoft plan. No layoffs are expected.

In Australia, Mark Leigh runs the SMS&P business after David Gage resigned from the role. As for its local EPG business, the head of the unit is yet to be appointed, as Steven Worrall was given the managing director title after Pip Marlow left the company. However, how these changes will affect Microsoft Australia and New Zealand is yet to be determined.

This move follows the recent departure of then Microsoft chief operating officer, Kevin Turner, whose role was not refilled but was split amongst five senior executives, including Althoff. As part of that restructure, Althoff was handed the Worldwide Commercial Business, focusing on the Enterprise and Partner Group, Public Sector, Small and Midmarket Solutions and Partners, the Developer Experience team, and services.

The company restructure also sees the creation of a new One Commercial Partner business, which combines various partner teams within Microsoft; a unit called Microsoft Digital, which is expected to grow Microsoft’s cloud division; and the merger of its Worldwide Public Sector and Industry businesses. One Commercial Partner will be led by former Salesforce vice president and Microsoft’s current Corporate Vice President of Enterprise Partner Ecosystem, Ron Huddleston. The new Microsoft Digital group will push Microsoft’s current customers and partners to use the company’s cloud programs. Anand Eswaran, corporate vice president of Microsoft Services, will lead that group.

Corporate Vice President of Worldwide Public Sector Toni Townes-Whitley will lead a combined group comprising Microsoft’s Worldwide Public Sector and Industry Businesses

Jeff Teper, who was Microsoft’s corporate vice president of corporate strategy, announced on Twitter last week he now leads the company’s OneDrive and SharePoint teams. It’s a familiar role, as Teper led the group that first built SharePoint for its 2001 launch. The move seems to be the latest to make room for Kurt DelBene, who was brought back to the executive team after retiring in 2013 to help the U.S. government fix the healthcare.gov website. DelBene assumed a new title as executive vice president of corporate strategy and planning in April. (Soon after, Eric Rudder, executive vice president of advanced strategy, and Mark Penn, executive vice president of advertising and strategy, announced they would be leaving Microsoft.)

David Treadwell, a longtime Microsoft executive who oversaw the Windows engineering team, is also on the move. He’s taking an unidentified role in the Cloud and Enterprise group. Treadwell told staff he was reluctant to leave the Windows team, but “when the CEO calls, well, you take that call.”

According to Microsoft’s announcement, Kim Akers and the ISV team, Victor Morales and the Enterprise Partner team, and Gavriella Schuster and the WPG team will all be moving into One Commercial Partner.

Ransomware was on the rise throughout 2016.

January 10th, 2017

49% of businesses fell victim to cyber ransom attacks in 2016

Ransom was the top motivation behind cyberattacks, according to Radware's Global Application and Network Security Report 2016-2017.
The report listed five cybersecurity predictions for 2017:
1. IoT will become an even larger risk. The Mirai IoT Botnet code is available to the public, making it more likely that cyber criminals of all experience levels are already strengthening their capabilities. In 2017, exponentially more devices are expected to become targeted and enslaved into IoT botnets. IoT device manufacturers will have to face the issue of securing their devices before they are brought to market, as botnet attacks from these devices can generate large-scale attacks that easily exceed 1 Tbps.
2. Ransomware attacks will continue to grow. These attacks will target phones, laptops, and company computers, and will likely take aim at healthcare devices such as defibrillators in the future, the press release stated.
3. Permanent Denial of Service (PDoS) attacks on data centers and IoT operations will rise. PDoS attacks, sometimes called “phlashing,” damage a system to the degree that it requires hardware replacement or reinstallation. These attacks are not new, but Radware predicts they are likely to become more pervasive in 2017 with the plethora of personal devices on the market.
4. Telephony DoS (TDoS) will become more sophisticated. These attacks, which cut off communications in a crisis, could impede first responders’ situational awareness, exacerbate suffering and pain, and potentially increase loss of life.
5. Public transportation system attacks will rise. As cars, trains, and planes become more automated, they also become more vulnerable to hackers, Radware stated.

To avoid ransomware attacks and other cyber threats:
• Keep software up to date.
• Back up all information every day to a secure, offsite location (e.g. Azure cloud backup).
• Segment your network.
• Perform penetration testing.
• Train staff on cyber security practices.
• Ensure passwords are strong and are regularly updated.
• Ensure you have deployed appropriate antivirus/anti-malware tools.
• Test your backup and restore procedures periodically.
• Ensure your support contracts are up to date.
• Don't forget your hardware: out-of-date protocols on routers, for example, may be targets for hackers.
• If you have large, complex networks, critical data and demanding up-time requirements, then consider ethical-hacking penetration testing.
• Managed services solutions can monitor your networks and services to ensure critical hardware and services are functioning.
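
As one small, hedged illustration of the anti-malware point above, the built-in Windows Defender cmdlets on Windows 10 / Server 2016 can be scripted to refresh signatures and run a scan (a minimal sketch only; your anti-malware product may differ):

# Refresh definitions and run a quick scan with the built-in Defender module
Update-MpSignature
Start-MpScan -ScanType QuickScan
# Confirm real-time protection is on and signatures are recent
Get-MpComputerStatus |
    Select-Object AMServiceEnabled, RealTimeProtectionEnabled, AntivirusSignatureLastUpdated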

Azure – what is it exactly?

January 8th, 2017

You may have recently seen a television commercial for “The Microsoft Cloud,” which featured Healthcare, Cancer Research, and Cybercrime. So, what does this have to do with Microsoft Azure?

Microsoft Azure is the Microsoft product name for the Microsoft Cloud. The names are used synonymously in the technical industry.

Amid the digital transformation shift to the cloud, the question remains: "What is Azure, and for whom is it meant?"

Azure was announced in October 2008 and released in February 2010 as Windows Azure, and was then renamed Microsoft Azure in March 2014.

Azure is a cloud computing platform, plus the underlying infrastructure and management services, created by Microsoft to build, deploy, and manage applications and services through a global network of Microsoft-managed data centers.

What about Microsoft Azure Data Centers?

There are 34 interconnected Microsoft Data Regions around the world with more planned.

Microsoft describes Azure as a “growing collection of integrated cloud services, including analytics, computing, database, mobile, networking, storage, and web.” Azure’s integrated tools, pre-built templates and managed services simplify the task of building and managing enterprise applications (apps).

Microsoft Corp. CEO Satya Nadella calls Azure, “the industry’s most complete cloud — for every business, every industry and every geography.”

The Complete Cloud

For many businesses, their first foray into leveraging cloud software as a service (SaaS) is with Microsoft Office 365, Exchange online for hosted email, or CRM online for managing business and customer relationships. However, the Azure platform is much more than just an online business software delivery platform.

Here are just a few of the things that you can do with Azure:
• Build and deploy modern, cross platform web and mobile applications.
• Store, backup and recover your data in the cloud with Azure-based disaster recovery as a service (DRaaS).
• Run your Line of Business applications on Azure.
• Run large scale compute jobs and perform powerful predictive analytics.
• Encode, store and stream audio and video at scale.
• Build intelligent products and services leveraging Internet of Things services.

Use Azure, and your partner, to rapidly build, deploy, and host solutions across a worldwide network, and to create hybrid solutions that seamlessly integrate your existing on-premise IT with Azure.

Many leverage Azure to protect data and meet privacy standards like the new international cloud privacy standard, ISO 27018, or HIPAA.

Azure customers can quickly scale up infrastructure and, just as importantly, scale it down, while only paying for what they use.

Azure also supports a broad selection of operating systems, programming languages, frameworks, tools, databases and devices.

Contrary to the perception that Azure is for Windows only, nearly one in three Azure virtual machines run Linux.3

Widespread Adoption

More than 80 percent of Fortune 500 companies rely on Azure, which offers enterprise grade SLAs on services. In addition, Microsoft is the only vendor positioned as a Leader across Gartner’s Magic Quadrants for Cloud Infrastructure as a Service (IaaS), Application Platform as a Service (PaaS), and Cloud Storage Services for the second consecutive year.1

What is Microsoft Azure IoT?

Microsoft’s powerful Azure Internet of Things Hub and tool suite has also been widely adopted for use in commercial and scientific applications to securely connect and manage Internet of Things (IoT) assets. The service processes more than two trillion IoT messages weekly.4

From broadcasting the Olympics to building massively multiplayer online games, Azure customers are doing some amazing things, and in increasing numbers. Microsoft recently revealed that the rate of Azure customer growth has accelerated to more than 120k new Azure customer subscriptions per month.4 In line with the accelerated adoption, the company is projecting an annualized commercial cloud revenue run rate of $20 Billion in 2018.3

Cloud Leadership

With Azure, Microsoft has made a huge commitment to cloud computing. Since opening its first datacenter, Microsoft has invested more than $15 billion in building its global cloud infrastructure.5 In addition, the company recently announced it would build its first Azure data center in France this year as part of a $3 billion investment to build its cloud services in Europe.6

Microsoft is quickly closing the gap in market share with IaaS provider Amazon Web Services, (AWS). While 37.1% of IT professionals surveyed indicated that Amazon AWS is their primary IaaS platform, Microsoft Azure is a close second at 28.4%, followed by Google Cloud Platform at 16.5%.7

And hot off the press…
Microsoft isn’t building its own connected car — but it is launching a new Azure-based cloud platform for car manufacturers to use the cloud to power their own connected-car services.

The new Microsoft Connected Vehicle Platform will go live as a public preview later this year.
“This is not an in-car operating system or a ‘finished product’,” Microsoft’s EVP for business development Peggy Johnson writes in this week’s announcement. “It’s a living, agile platform that starts with the cloud as the foundation and aims to address five core scenarios that our partners have told us are key priorities: predictive maintenance, improved in-car productivity, advanced navigation, customer insights and help building autonomous driving capabilities.”

Microsoft also announced that it is partnering with the Renault-Nissan Alliance to bring the new connected-car services to Renault-Nissan’s next-gen connected vehicles. The two companies were already working together on other projects before this, so it’s maybe no surprise that Renault-Nissan is Microsoft’s first partner.
Microsoft is also working with BMW to develop that company’s BMW Connected platform on top of Azure. BMW and Nissan also showed in-car integrations with Microsoft’s Cortana digital assistant at CES this year, so your future car could potentially use Cortana to power its voice-enabled services. For the time being, though, it looks like these are still experiments.

Microsoft has talked about its aim to bring "intelligence" to as many of its services as possible. It has also recently opened up Cortana to third-party developers, so bringing it to its connected car platform is a logical next step (and we're also seeing Amazon do the same thing with Alexa).

Johnson also used today’s announcement to take a thinly veiled swipe at Google/Alphabet, which spun out its self-driving car unit a few weeks ago. “As you may have gathered, Microsoft is not building its own connected car,” she writes. “Instead, we want to help automakers create connected car solutions that fit seamlessly with their brands, address their customers’ unique needs, competitively differentiate their products and generate new and sustainable revenue streams.”

Year end close for your Windows operating system

December 30th, 2016

1: Upgrade applications

Updating applications should be part of a regular maintenance cycle, but it's a task that sometimes falls through the cracks. Ensure that applications are always current, so as to maximize compatibility with newer hardware and to support the overall security posture of a system. Don't head into 2017 with out-of-date software.

2: Back up data

This critical task should be performed on a regular basis to ensure that data is recoverable in the event of loss, theft, or catastrophe. If you don’t have a properly configured, automated backup scheme, then manually perform a full backup of all your data. All versions of Windows since Vista include a modern backup application built into the OS which allows backup to an external drive or to a shared folder on a network drive. Although not as robust a backup solution as some third-party offerings, it works as advertised and even allows for backups to run on a schedule.
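
If you prefer to script this rather than use the GUI, the built-in wbadmin command is the command-line counterpart (a hedged sketch; E: is assumed to be your external backup drive, and the command must run from an elevated prompt):

# One-off full backup of the system drive to an external disk
wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet

Third-party tools and Azure Backup offer far more, but this at least gives you a recoverable system image to fall back on.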

3: Update Windows

Windows XP featured the ability to integrate system updates automatically. Such a simple feature has continued to be streamlined in current Windows versions to assist in keeping machines patched against malware and security threats. Even so, millions of devices worldwide do not regularly receive system updates. I can't think of a better time than the new year to develop the habit of performing system updates to protect your devices and keep them stable.

4: Clean temporary files/cache folders

With large amounts of data going back and forth online and increased reliance on web-based applications, the temporary folders and cache folders, including the cookies, that store all of this data can grow to unbelievable sizes in a short amount of time. To free up storage space—and to prevent this type of data from being used to compromise your system and/or accounts—it’s important to delete these temporary files to clean your system.

Among the many applications available that offer system cleaning utilities, CCleaner stands out as powerful and easy to use. Even the freeware version has enough capabilities to clean out all temporary folders and caches, and it can make storage space available with its handy scripts. You can set it to run upon startup, so that your system is always clean and functioning properly.
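
If you would rather not install a third-party cleaner, a one-line PowerShell command clears the current user's temp folder (a minimal sketch; files in use are simply skipped):

# Clear the current user's temp folder, skipping locked files
Remove-Item -Path (Join-Path $env:TEMP '*') -Recurse -Force -ErrorAction SilentlyContinue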

5: Update anti-malware and run a full-system scan

The popularity of Windows makes it a magnet for security threats. An up-to-date malware detection system is often the only thing standing between keeping and losing your data.

Additional security protections, such as a firewall and web and email filtering, should also be used. Free apps such as Avira, Windows Defender, and Avast also rate highly; they have only a slight impact on system resources while offering excellent performance. For business, look at a tool like Kaspersky.

6: Use System File Checker (SFC)

Windows files get modified both when system updates occur, and when applications get installed and upgraded. Those files can also be corrupted by malicious software or incomplete updates.

When system files aren’t as they should be, weird things will occur to your Windows installation.

To prevent Windows from acting erratically or failing to load the system and/or applications correctly, regularly run SFC—the built-in Microsoft utility to check and fix system file issues. Here’s how:
1.Launch CMD with elevated privileges.
2.Type sfc /scannow to begin the verification process for all system files. As the scan progresses, any corrupt files will automatically be corrected from the cache stored locally in the Windows directory.
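
For example, from an elevated PowerShell or command prompt (the DISM step applies to Windows 8 and later, and is only needed when SFC reports files it could not repair):

sfc /scannow
# Repair the component store itself if SFC cannot fix everything
DISM /Online /Cleanup-Image /RestoreHealth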

7: Uninstall unused applications

We all use a variety of apps to get work accomplished. Some are small, while others are large suites. Over time some of these apps lose their viability and no longer serve their function, which presents several problems. Unnecessary apps can use up resources and present security issues. If the apps are no longer being used and are also no longer supported by the developer, then there could be an even greater security risk. Close out the year by ridding yourself of these unused apps before data loss occurs.

8: Transfer Windows data from one PC to another

If you’re upgrading to a new PC or swapping out your gear, then transfer your account profile, including files & folders and settings, from your old PC to the new one. Sadly, Microsoft’s Windows Easy Transfer does not support Windows 10. However, Microsoft has a partnership with LapLink to officially provide Windows 10 support for its PCMover Express software ($14.99-29.99) to migrate data to a new Windows 10-enabled PC. The application also includes regular and enterprise editions that may be used over corporate networks and provides zero-touch support.

9: Perform a PC reset if your PC is constantly playing up or not working

From Windows 8 on, Microsoft has included recovery options to fix non-working computers, as well as adding the option to factory-reset an installation. This essentially deletes all user data, including apps, and reloads the Windows OS back to its defaults. Depending on the speed of the computer, the process will typically take two hours or so to complete.

To accomplish this, follow the steps below:
1.Go to Settings | Update & Security | Reset This PC | Get Started.
2.Choose the option Remove Everything, as the best option to fully clean the internal drive, settings, and all user data.

10: Reboot Windows to clear sleep/hibernation data

Most of us are guilty of this one. We use the PC for work and, when done, put it to sleep. We hardly ever reboot, and never shut down unless the system has become unstable or the battery runs out of power.

Each time the PC goes to sleep it stores copies of the working environment into RAM and hibernation files so that when the user wakes the system, they can resume where they left off. The problem is that the files never get flushed properly until a reboot, or shutdown. So they just sit there taking up space and potentially leaving a security vulnerability, since some system updates require a machine restart to complete properly.

11: Upgrade hardware

For those working on older Windows PCs, it may be a good time to upgrade by adding more RAM or swapping out a mechanical HDD for a solid-state drive. Or consider upgrading to a larger external drive or adding some accessories, like a docking station, to boost performance.

If you choose to go the total system upgrade path, performing the tasks listed above will prepare your current PC for its new owner by ensuring that your data is completely backed up and ready to be transferred to its new home and that the older equipment is in prime condition for the next user.

12. Clean up your desktop
Files on the desktop typically go into cache and eat up your memory.
Consider storing all shortcuts in a folder and then putting a single shortcut to that folder on the desktop.

Learn to use all the features

The taskbar calendar now integrates with Windows 10's core Calendar app. Click the date and time on the right-hand side of your taskbar, and the calendar that pops up includes a full look at your schedule for the day.

If you'd like to be able to just bark commands at your PC, open Cortana by clicking the search field in the taskbar and select the Notebook icon in the left-side options pane. Select Settings from the list, then simply enable the Let Cortana respond when you say "Hey Cortana" option. You'll need an active microphone for this to work.
Shortcut keys

• Windows Logo Key + Ctrl + D: Use this combination to create and switch to a new virtual desktop. Why do you need this? Let's say you've launched so many applications at the same time that you lose track of everything. What could be better than switching to a clean and slick desktop?
• Windows Logo Key + L: Use this combination to lock your PC or switch between accounts.
• Windows Logo Key + C: Use this combination to wake Cortana up in listening mode.
• Windows Logo Key + I: Use this combination to open the Settings app.
• Win+Tab – Activates Task View (more on that later)
• Win+Q or Win+S – open up Search/Cortana in typing mode, perfect for Queries and Searches
• Win+Left or Win+Right – snap the current window to the left or right, taking up half of the screen space
• Win+Up – maximize a window
• Ctrl+Win+Left or Ctrl+Win+Right – switch to the next virtual desktop
• Ctrl+Win+D – create a new virtual desktop

Task View is a fancier and more visual way to view all the open windows that you have. It presents windows in a grid-like arrangement versus the horizontal strip of Alt+Tab. You can group together windows for a certain topic or task. The advantage of this is that you can have, say, 10 windows open but only 4 or 5 are really visible at a time. The rest are hidden away on another virtual desktop and won't show up on your taskbar or when you Alt+Tab. Of course, you can change that default behavior, too. Virtual Desktops are a great way to compartmentalize your activities so that you don't get overloaded with unnecessary windows and can just focus on the task at hand. Virtual Desktops don't work in Tablet Mode. Task View, however, works as normal and is in fact the default way of switching windows (you can't really Alt+Tab on a touchscreen).

Here’s a fun little easter egg. Create a new folder on the desktop and name it exactly as below (noting the period after “GodMode”):

GodMode.{ED7BA470-8E54-465E-825C-99712043E01C}

Once you hit Enter, the folder will change its icon and you will be presented with a folder that has a smorgasbord of settings all laid out in a single list. These are practically the entire contents of Control Panel and then some, but not the new Settings app. It might be handy to get an overview of everything there is to find as far as settings go, but you might still be better off using Search.
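
If you prefer, the same folder can be created in one line from PowerShell rather than renaming it by hand (the path assumes your desktop is in the default location):

New-Item -ItemType Directory -Path "$env:USERPROFILE\Desktop\GodMode.{ED7BA470-8E54-465E-825C-99712043E01C}"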

Offline Maps – you can now download certain maps for use even when you’re not connected to the Internet. Very handy for traveling. Just be sure to mind your storage space, as they can eat up quite a lot.

Customize the size of the Start Menu using the "Ctrl" and "Arrow" keys.

For many more tips, look here:

http://www.thewindowsclub.com/windows-10-settings

http://allbestposts.com/windows-10-tips-and-tricks/

PowerShell – why?

December 19th, 2016

What is the point of PowerShell?

It handles any task that requires scripting and gives power back to the user, developer, or administrator. PowerShell is a tool that operates at a very high level of abstraction, and thus can quickly provide a means of getting a task done by chaining software tools together without resorting to writing a compiled application.

PowerShell is an extensible, open-source, cross-platform, object-oriented scripting language that uses .NET.

It can use COM, WMI, WS-Management and CIM to communicate and interact with any Windows-based process.
It can execute scripts on either a local workstation or remotely.

It is ideal for automating all sorts of processes, and is simple enough to manage your workstation, and yet robust enough to manage SQL Azure.

It will evolve to become the built-in batch processing system in future versions of Windows.

It is important as a configuration management tool and task automation tool, and is versatile enough to be used as a general-purpose programming language.

How did PowerShell come about?

Unix inherited from its mainframe ancestors the use of the batch and its script.
The use of the script allowed UNIX to develop a group of specialized applications that did one job and did it well.
Data could be passed into an application through its standard input, and the results passed to the standard output which meant that data could be streamed like items on a conveyor belt.

It was like building repeatable processes out of Lego-like blocks of code.

Scripting did more than encourage piped streams of data. It also encouraged batches and command-line configuration of machines and services. This made it easy to use Unix for servers, because all administration tasks could be scripted.

Scaling up to large groups of servers was smooth since everything was in place to allow it to happen.

Scripting also made the management of servers more precise and error-free. After the script was developed, no further work was needed. The script would do the same thing in the same order with the same result. It made operations work a lot easier.

Think the Korn shell, with ideas from the Bash shell.

Why didn’t Windows have a powerful scripting language like Korn?

Unlike the contemporary UNIX workstations, the first PCs had no pretensions to host server processes. They were low-specification, affordable personal computers that initially conquered the market previously occupied by dedicated word processors, before becoming ubiquitous with the invention of the spreadsheet.
They had the ability to run batches, but this was intended merely to ease the task of installing software. Scripting just seemed old-fashioned, or best left to DOS.

Microsoft DOS could and did run batches from the command processor, and autoexec.bat is still there in Windows (called AUTOEXEC.NT and located in the %SystemRoot%\system32 directory).

After MSDOS borrowed from the UNIX clone Xenix, this command processor took on some of the features of UNIX shells such as the pipe, but with limited functionality when compared to the UNIX shells.

Microsoft Windows was originally booted from the command processor and, in later editions, it took over the tasks of the operating system and incorporated the old MSDOS command-line interface tool (shell).

The features of the batch were sufficient to allow it to do a lot of configuration, installation and software maintenance tasks. The system wasn't encouraged or enhanced after Xenix was abandoned, but remained a powerful tool. Xenix's replacement, Windows NT or WNT (add a letter to DEC's VMS to guess its parentage), did not have anything new for the command processor, and inherited MSDOS's enhanced version from MSDOS 3.3.

This batch language still exists in the latest versions of Windows, though it is due to be deprecated. It has had quite a few enhancements over the years, but essentially what came into MSDOS is still the basis of what is currently shipped. It has a major failing within a Windows environment: it cannot be used to automate all facets of GUI functionality, since that demands at least COM automation and some way of representing data other than text.

There have been attempts to replace the DOS batch file technology, including VBS and Windows Script Host (1998), but PowerShell is by far the most effective replacement.

Why did it take so long to get PowerShell?

Maybe because Microsoft under Bill Gates retained the vision that BASIC should remain the core language for Windows scripting and administration …

Basic scripting driving COM automation

Today the Office applications still have underlying BASIC scripting that can be controlled via COM automation. To keep batches consistent with this, the tasks done by batch scripting were to be done by Visual Basic for Applications (VBA). This was supplied with the operating system to drive all automation tasks. Wang's Office system, similarly, was automated and scripted via COBOL!

Language-driven development and divergence

Over time each Office application developed a slightly different, incompatible dialect, and they could not be kept in sync. Visual Basic was inadequate for the task and evolved into VB.NET, a somewhat comical dialect of Java. It proved to be unpopular. VBA was never quite consistent with the Visual Basic used for building applications.

Windows Script Host

Windows Script Host was introduced as an automation and administration tool to provide automation technology, primarily for Visual Basic and JavaScript. It supported several interpretive languages such as BASIC, Perl, Ruby, Tcl, JavaScript, Delphi and Python. Initially it had security loopholes, finally solved with digital signing in Windows XP. It is still installed with MS Windows and still provides a number of useful COM interfaces that can be accessed in PowerShell and any other application that can interact with COM.

Windows Script Host was designed before .NET, so it is not able to directly use the .NET library. It can use WMI, WS-Management and CIM for administration and monitoring, but it focused on managing the platform through very low-level abstractions such as complex object models, schemas, and APIs. Although it was useful for systems programming, it was almost unusable for the typical small, simple and incremental task that is at the heart of administration, which needs very high levels of abstraction.

Microsoft competes in the server market

Microsoft was focused on the desktop market for a long time, so perhaps did not realize the scale of the problem of competing in the server market. In the GUI-centric Microsoft culture and ecosystem, the idea was that all configuration was a point-and-click affair. That is fine for one or two servers, but not so easy or error-free for a server room.

PowerShell

Due to the determination and persuasive powers of Jeffrey Snover, Microsoft belatedly woke up to the fact that it had no viable solution for the administration of the number of servers found in a medium-sized company.
The GUI didn't scale, and the batch system of the command line, though useful, was stuck in a mid-eighties time warp.

Microsoft had to replace the command line; so it needed all the things it and other interactive shells had, such as aliases, wildcard matching, running groups of commands, conditional running of groups of commands and editing previous commands.
It also had to replace VBA, and to integrate easily with Windows Management Objects.
It had to take over the role of VBA embedded in applications to make automation easier.

Microsoft needed something that looked both backwards and forwards, i.e. an industry-standard shell backward compatible with the command line.

PowerShell started with the POSIX standard shell of IEEE Specification 1003.2, the Korn shell, which is also available in Windows. However, this dealt only with strings, so it had to be altered to also deal with objects so that it could access WMI, WS-Management, CIM and COM. Because it needed so much connectivity and data interchange, it had to be able to use the .NET library to process .NET objects and datatypes.

So a new system also needed to understand .NET to utilize the man-years of work of providing a common object model able to describe itself, and that can be manipulated without converting either to or from text. The new scripting system had to be resolutely object-oriented.

So the new PowerShell needed the ability to use any .NET object or value.

PowerShell was given an intuitive naming convention based on the verb-noun pair, with simple conventions such as 'Get' to get an object and a noun describing the object.

To replace the command line, PowerShell had to be better. The whole point of a command shell is that it must be convenient to type short commands into it, much like a REPL in Python. PowerShell also needs to work with existing terse DOS command-line commands so that an expert can type in very truncated commands.

PowerShell was also to be used in scripts stored on disk and repeatedly invoked, with just a change in parameters. This also meant that it had to be easy to read, with intuitive commands and obvious program flow.

It wasn't an easy compromise, but it was done by means of aliases. Aliases also helped to 'transition' users from the other shells they were using to PowerShell (for CMD.EXE it is dir, type, copy, etc.; for UNIX, ls, cat, cp, etc.). You can even define your own in PowerShell!
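
For example (a quick sketch you can run in any PowerShell session):

Get-Alias dir, ls          # both resolve to Get-ChildItem
Get-Alias cat, type        # both resolve to Get-Content
New-Alias -Name np -Value notepad.exe   # define your own alias for the current session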

PowerShell took an idea from .NET: everything should be learnable by discovery, without needing documentation. All the objects and cmdlets in PowerShell are self-documenting, in that you can use PowerShell to find out what they do, what functions can be called, and what parameters they take.
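
For example, discovery is built in through three cmdlets (a minimal sketch):

Get-Command -Verb Get -Noun Service*   # find cmdlets by verb-noun pattern
Get-Help Get-Service -Examples         # built-in help and worked examples
Get-Service | Get-Member               # list the properties and methods of the objects returned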

Why is the PowerShell Pipeline Important?
The pipeline in PowerShell inherited the concept of a pipe from UNIX. The PowerShell team had to solve the problem of dealing with Windows Management Objects and Instrumentation by passing objects rather than text down the pipe.

Having done so, it found itself in possession of a radical and extraordinarily useful system. It had the means of processing objects as though they were on a conveyor belt, with the means of selecting and manipulating each one as it passed down the pipeline.

This made the code easier to understand and also helped with memory management. A long file could be passed down a pipeline line by line, for example when searching for text, instead of having to read the entire file into memory (you can do that too if you want, and if you have no fear of the large object heap; you have to do it if you want, for example, to order the lines). It also meant you needed only one cmdlet for selecting things, one for sorting, one for grouping, and one for listing things out in a table. PowerShell could do a lot in a single line of code, far more than C# could.
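
A small sketch shows both points: objects flow down the pipe and are filtered without parsing text, and a file can be streamed line by line (server.log is a hypothetical file name):

# The five processes using the most memory, selected and sorted as objects
Get-Process |
    Where-Object { $_.WorkingSet64 -gt 100MB } |
    Sort-Object WorkingSet64 -Descending |
    Select-Object Name, Id, WorkingSet64 -First 5

# Stream a large log file line by line rather than loading it into memory
Get-Content .\server.log | Select-String 'ERROR'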

Suddenly, the task of tackling the huge range of data on the average server that one might need to know about was less frightening. It was already there, and was now easy to get at and filter.

Why is PowerShell useful?
Scripts don’t require special components.

PowerShell now has all the power of a compiled .NET language. Automating processes using the Windows command line needed many existing command files to determine settings and to configure; this meant that the developer often had to write components in a compiled language. In developing scripts, part of the time was spent making small commands. This isn't necessary in PowerShell, thanks to .NET.

PowerShell simplifies the management of hierarchical data stores. Through its provider model, PowerShell lets you manage data stores such as the registry, or a group of SQL Servers using the same techniques of specifying and navigating paths that you already use to manage files and folders.
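
For example, the registry provider lets you 'dir' the registry with the same commands you use on disk (a minimal sketch):

Set-Location HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion
Get-ChildItem        # list the subkeys, just like listing folders
Get-ItemProperty .   # read the values of the current key
Set-Location C:\     # back to the file system provider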

This doesn’t turn PowerShell into a rival to C#, VB-Net, ActionScript or F#.

It is not for developing applications but for automating administrative tasks.

It is theoretically possible to write a webserver in PowerShell, or an interactive GUI using Windows Presentation Foundation but that is not its purpose.

What is PowerShell’s main use?

Traditionally, the command line was used for complex deployments. PowerShell can work remotely on any computer in the domain, and give far more information about the computer. It quickly became the default means of deployment for Windows.

This is great for the developer, who can build a package with NuGet and use Chocolatey to deploy it. Linux allows you to install a package with just one simple command. Chocolatey does the same, but also allows you to update and uninstall the package just as easily. A simple script can grab the latest source from Git, compile it, install it and any dependencies, and do any special configuration tasks. There are 4,511 packages you can install from the Chocolatey site. PowerShell now has its own package manager, but the current incarnation isn't as versatile as Chocolatey.
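
A typical Chocolatey lifecycle looks like this (assuming Chocolatey is installed and the commands run from an elevated prompt; git is just an example package):

choco install git -y
choco upgrade git -y
choco uninstall git -y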

Server Administration.

The release of PowerShell was most welcomed by the server teams.
The Microsoft Exchange Server team were early adopters and used PowerShell to allow the administration of Exchange.
The Microsoft SQL Server team, and Active Directory team also followed suit.
These teams provided specialized cmdlets that covered all aspects of the administration of the server.

Windows Server now has the capability of using Hyper-V to provide a 'private cloud', which gives companies a degree of 'self-service' for server resources, all driven and maintained by PowerShell.

Provisioning.

Provisioning is one of the areas where PowerShell excels.

PowerShell’s DSC package allows a PowerShell script to specify the configuration of the machine being provisioned, using a declarative model in a simple standard way that is easy to maintain and to understand.

It can either ‘push’ the configuration to the machine being provisioned, or get the machine to ‘pull’ the configuration.
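
A minimal DSC sketch, assuming a target node named SERVER01 and the built-in WindowsFeature and File resources, looks like this: the configuration is declared, compiled to a MOF document, and then pushed.

Configuration WebServerBaseline {
    Node 'SERVER01' {
        WindowsFeature IIS {
            Name   = 'Web-Server'
            Ensure = 'Present'
        }
        File DeployFolder {
            DestinationPath = 'C:\Deploy'
            Type            = 'Directory'
            Ensure          = 'Present'
        }
    }
}

WebServerBaseline -OutputPath C:\DSC                 # compile the declaration to a .mof file
Start-DscConfiguration -Path C:\DSC -Wait -Verbose   # push the configuration to the node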

Chocolatey, a PowerShell script, can not only install a large range of software, but also update it or remove it.

PowerShell has a built-in system called ‘PackageManagement’ that isn’t so versatile, but which allows you to install packages from a wider variety of sources.
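
For example (assuming the default PSGallery source registered on PowerShell 5; Pester is just an example package):

Find-Package -Name Pester -Source PSGallery        # search a registered source
Install-Package -Name Pester -Source PSGallery -Force
Get-Package | Where-Object ProviderName -eq 'PowerShellGet'   # what has been installed this way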

Use PowerShell within an application or website
As well as providing a scripting environment, PowerShell can be embedded into an application by using System.Management.Automation, so that the user of the application can extend it via scripts. You can even do this in ASP.NET.
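
The same hosting API that an application would call can be exercised from PowerShell itself, which gives a feel for how embedding works (a minimal sketch):

$ps = [System.Management.Automation.PowerShell]::Create()
$null = $ps.AddScript('Get-Date')   # the script the host wants to run
$ps.Invoke()                        # returns the objects produced by the embedded pipeline
$ps.Dispose()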

Parallelism and workflow
Although PowerShell is an interpreted dynamic language (using .NET's DLR), its performance is enhanced by its ability to run parallel processes and to run asynchronously. It is also designed to run securely on other machines, remotely, and pass data between them. All this is possible without Workflow.

Scripted processes that have complex interdependencies need to be interruptible and robust, and that is what PowerShell workflow supports.

Workflow can be complicated, and will always be a niche technique for scripting. PowerShell is now able to run complex workflows within a domain, making it possible to script even the most difficult business processes that contain long-running tasks requiring persistence and the ability to survive restarts and interruptions.

PowerShell uses the Windows Workflow Foundation (WF) engine. A PowerShell workflow involves the PowerShell runtime compiling the script into Extensible Application Markup Language (XAML) and submitting this XAML document to the local computer’s Workflow Foundation engine for processing.

PowerShell Workflow scripting is particularly useful in high-availability environments for processes such as ETL (data Extraction, Transformation and Loading) that potentially require throttling and connection pooling, and it is ideal where data must come from a number of sources and be loaded in a certain order.
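
A minimal workflow sketch (PowerShell 3.0 to 5.1; the server names are placeholders) shows the parallel, checkpoint-friendly style:

workflow Update-ServerGroup {
    param([string[]]$ComputerName)
    foreach -parallel ($computer in $ComputerName) {
        InlineScript { "Patching $using:computer" }
    }
}

Update-ServerGroup -ComputerName 'SRV01','SRV02','SRV03'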

SQL 2016 Sp1- this is a big deal – Synergy Software Systems

November 19th, 2016

In addition to a consistent programmability experience across all editions, SQL Server 2016 SP1 introduces all the supportability and diagnostics improvements first introduced in SQL Server 2014 SP2, as well as new improvements and fixes centered around performance, supportability, programmability and diagnostics, based on learnings and feedback from customers and the SQL community.

SQL Server 2016 SP1 also includes all the fixes up to SQL Server 2016 RTM CU3 including Security Update MS16–136.

SQL editions have traditionally been differentiated by features; this meant that essential features for day-to-day database use were not present in Express or Standard editions. Our view is that this is not desirable, that there is a core set of features needed in all editions, and that differentiation should be more about the hardware size and resources supported.

Well, SQL Server 2016 SP1 now brings us close to that wish, so it's a really big deal for the SMB and mid-market customer.

Once you have an application using SQL Server 2016 Standard Edition, you can just do an Edition Upgrade to Enterprise Edition to get even more scalability and performance, and take advantage of the higher license limits in Enterprise Edition. You will also get the intrinsic performance benefits that are present in Enterprise Edition.

The table compares features which were previously available only in Enterprise edition and which are now enabled in Standard, Web, Express, and LocalDB editions with SQL Server 2016 SP1. This consistent programmability surface area allows developers and ISVs to build applications leveraging these features, which can then be deployed against any edition of SQL Server installed in the customer environment.

This is a bold move by Microsoft, and should increase Standard sales, and customer satisfaction, without cannibalizing Enterprise sales. Standard Edition customers can use these features both to consolidate their codebases and, in many scenarios, build solutions that offer better performance.

There are many new features available across all editions with SP1.
There are still differences in Enterprise:

Availability features like: online operations, piecemeal restore, and fully functional Availability Groups (e.g. read-only replicas) are still Enterprise only.

Performance features like parallelism still don’t work in Express Edition (or LocalDB).

Automatic indexed view usage without NOEXPAND hints, and high-end features like hot-add memory/CPU, will continue to be available only in Enterprise.

Operational features like: Resource Governor, Extensible Key Management (EKM), and Transparent Data Encryption will remain Enterprise Edition only.

Others, like Backup Encryption, Backup Compression, and Buffer Pool Extension, will continue to work in Standard, but will still not function in Express.

SQL Server Agent is still unavailable in Express and LocalDB. As a result, Change Data Capture will not work. Cross-server Service Broker also remains unavailable in these editions.

In-Memory OLTP and PolyBase are supported in Express, but are unavailable in LocalDB.

Virtualization Rights haven’t changed and are still much more valuable in Enterprise Edition with Software Assurance.

Resource limits on the lower-level editions remain the same. The upper memory limit in Standard Edition is still 128 GB (while Enterprise Edition is now 24 TB).

I feel that Standard Edition is expensive enough that its memory limit should never be so dangerously close to the upper bound of a well-equipped laptop, and perhaps we should expect the limit to increase at least with each new version. If you are on Standard Edition and scale is required, then you can now use many Enterprise features across multiple Standard Edition boxes or instances, instead of trying to scale up.

All the newly introduced Trace flags with SQL Server 2016 SP1 are documented and can be found at http://aka.ms/traceflags.

SP1 contains a roll-up of solutions provided in SQL Server 2016 cumulative updates up to and including the latest Cumulative Update – CU3 and Security Update MS16–136 released on November 8th, 2016. Therefore, there is no reason to wait for SP1 CU1 to ‘catch–up‘ with SQL Server 2016 CU3 content.
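
Once the service pack is applied, a quick check (a sketch that assumes the SqlServer module and a local default instance) confirms the edition and patch level:

Invoke-Sqlcmd -ServerInstance '.' -Query "
    SELECT SERVERPROPERTY('Edition')        AS Edition,
           SERVERPROPERTY('ProductLevel')   AS ProductLevel,    -- reports 'SP1' once the service pack is installed
           SERVERPROPERTY('ProductVersion') AS ProductVersion;"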

The SQL Server 2016 SP1 installation may require a reboot post-installation.

Microsoft Dynamics 365 now available in the U.A.E. – ask Synergy Software Systems

November 1st, 2016

Microsoft Dynamics 365 is a suite of cloud services to help companies to accelerate their digital transformation with purpose-built apps to address specific business needs.

Dynamics 365 unifies CRM and ERP functions into applications that work smoothly together across all divisions: sales, customer service, field service, operations, financials, marketing, and project service automation. These apps can be easily and independently deployed and scaled on demand.

Start with what you need the most.

All apps are delivered through easy-to-use, mobile experiences and feature offline capabilities.  

Users can rely on Power BI, Cortana Intelligence and Azure IoT functions which are natively embedded.

In addition to that, Dynamics 365 and Office 365 are deeply integrated. Since Dynamics 365 uses the new Common Data Service, customers can extend functionality and build custom apps using PowerApps, Microsoft Flow (News: PowerApps and Flow available) as well as professional developer solutions.

To find out more call us now on 00971 43365589.