Archive for the ‘Security and Compliance’ category

Windows Server 2008 and 2008 R2 support will end January 14, 2020 - ask Synergy Software Systems about options.

November 16th, 2019

On January 14, 2020, support for Windows Server 2008 and 2008 R2 will end. That is only two months away.
It means the end of regular security updates.

Don’t let your infrastructure and applications go unprotected.

We’re here to help you migrate to current versions for greater security, performance and innovation.
009714 3365589

Azure Arc in preview - manage hybrid data across cloud platforms

November 16th, 2019

Now in preview, Azure Arc helps simplify enterprise distributed environments by managing everything via Azure services (like Azure Resource Manager). Connecting hybrid infrastructure via Azure Arc improves security for users via automated patching, and provides improved governance, with everything ‘under one roof’. Azure Arc is a tool that lets organizations manage their data on the Microsoft Azure cloud, Amazon Web Services (AWS), Google Cloud Platform, or any combination of these.

Microsoft says that deployments can be set up “in seconds” via Azure data services anywhere, a feature of Azure Arc.

Azure Arc also supports Kubernetes clusters and edge infrastructures, as well as on-premises Windows and Linux servers.
There is no final release date yet, but a free preview of Azure Arc is available.


Microcode BIOS Updates coming from Microsoft Update

November 13th, 2019

Intel microcode updates are coming from Microsoft Update or the Windows Catalog.
The security reasons why you should update the microcode on your processors are covered in these links:

https://support.microsoft.com/en-us/help/4093836/summary-of-intel-microcode-updates

https://www.intel.com/content/www/us/en/security-center/advisory/intel-sa-00233.html

https://www.amd.com/en/corporate/product-security

Microsoft is collaborating with Intel and AMD on these microcode updates.

When processors are manufactured, they have a baseline microcode baked into their ROM. This microcode is immutable and cannot be changed after the processor is built. Modern processors have the ability, at initialization, to apply volatile updates that move the processor to a newer microcode level. However, as soon as the processor is rebooted, it reverts to the microcode baked into its ROM. These volatile updates can be applied to the processor in one of two ways: via the System Firmware/BIOS from the OEM, or by the Operating System (OS). Neither updates the microcode in the processor’s ROM. If you were to remove the processor from one computer and install it in a computer with an older System Firmware/BIOS and an un-updated OS, then you would again be vulnerable.
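
As a quick way to see which microcode level the OS has actually applied on a given machine, you can read the per-processor values that Windows keeps in the registry. The Python sketch below is illustrative only, assuming the commonly present "Update Revision" and "Previous Update Revision" values under CentralProcessor\0; the raw bytes are printed as hex rather than decoded, since the exact layout can vary.

# Minimal sketch (Windows only): read the microcode revision values Windows
# records for the first logical processor. Value names and layout are
# assumptions to verify on your own systems.
import winreg

KEY_PATH = r"HARDWARE\DESCRIPTION\System\CentralProcessor\0"

def read_value(name: str) -> str:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        data, _ = winreg.QueryValueEx(key, name)
    # REG_BINARY: print as hex rather than guessing at the exact byte layout.
    return data.hex() if isinstance(data, bytes) else str(data)

if __name__ == "__main__":
    for name in ("Update Revision", "Previous Update Revision"):
        try:
            print(f"{name}: {read_value(name)}")
        except FileNotFoundError:
            print(f"{name}: value not present on this machine")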

Windows offers the broadest coverage and quickest turnaround time to address these vulnerabilities. Microcode updates delivered via the Windows OS are not new; as far back as 2007 some updates were made available to address performance and reliability concerns.

You could just take the OEM System Firmware/BIOS updates, but often Microsoft Update has the microcode updates to address issues much sooner.

When the processor boots, versioning ensures it uses the latest microcode update regardless of where it came from. Installing both System Firmware/BIOS updates and microcode updates from Microsoft Update is therefore fine. It is possible that the OEM updates the microcode to one level and the OS then updates it to an even higher level during the same boot.

Microcode updates install like any other update. They can be installed from Microsoft Update, WSUS, SCCM or manually installed if downloaded from the Catalog. The key difference is that the payload of the hotfix is primarily one of two files:

mcupdate_GenuineIntel.dll – Intel
mcupdate_AuthenticAMD.dll – AMD

These files contain the updated microcode, and Windows automatically loads them via the OS Loader to patch the microcode on the bootstrap processor. This payload is then passed to additional processors as they start up, as well as to the Hyper-V hypervisor if it is enabled.
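
To confirm which of these payload files is present after the update installs, a simple file check is enough. This is a minimal sketch assuming a default C:\Windows\System32 location; it only reports presence and timestamps, it does not validate the microcode itself.

# Check for the Intel/AMD microcode payload DLLs named above (default path
# assumed; adjust if Windows is installed elsewhere).
from datetime import datetime
from pathlib import Path

SYSTEM32 = Path(r"C:\Windows\System32")
PAYLOADS = {
    "Intel": SYSTEM32 / "mcupdate_GenuineIntel.dll",
    "AMD": SYSTEM32 / "mcupdate_AuthenticAMD.dll",
}

for vendor, dll in PAYLOADS.items():
    if dll.exists():
        stat = dll.stat()
        modified = datetime.fromtimestamp(stat.st_mtime).strftime("%Y-%m-%d")
        print(f"{vendor}: {dll.name} present ({stat.st_size} bytes, modified {modified})")
    else:
        print(f"{vendor}: {dll.name} not found")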

Enhanced HA and DR benefits for SQL Server Software Assurance from 1 November.

November 5th, 2019

The enhanced benefits to SQL licensing for high availability and disaster recovery that are listed below are now applicable to all releases of SQL Server for a customer with SQL Server licenses with Software Assurance. The updated benefits will be available in the next refresh of the Microsoft Licensing Terms.

Business continuity is a key requirement for planning, designing, and implementing any business-critical system. When you bring data into the mix, business continuity becomes mandatory. It’s an insurance policy that one hopes never to claim against. SQL Server brings intelligent performance, availability, and security to Windows, Linux, and containers and can tackle any data workload, from BI to AI, and from online transaction processing (OLTP) to data warehousing. You get mission-critical high availability and disaster recovery features that allow you to implement various topologies to meet your business SLAs.

A customer with SQL Server licenses with Software Assurance has historically benefited from a free passive instance of SQL Server for their high availability configurations. That helps to lower the total cost of ownership (TCO) of an application using SQL Server. Today, Microsoft is enhancing the existing Software Assurance benefits for SQL Server, which further helps customers implement a holistic business continuity plan with SQL Server.

Starting Nov 1st, every Software Assurance customer of SQL Server will be able to use three enhanced benefits for any SQL Server release that is still supported by Microsoft:
• Failover servers for high availability – Allows customers to install and run passive SQL Server instances in a separate operating system environment (OSE) or server for high availability on-premises in anticipation of a failover event. Today, Software Assurance customers have one free passive instance for either high availability or DR
• Failover servers for disaster recovery NEW – Allows customers to install and run passive SQL Server instances in a separate OSE or server on-premises for disaster recovery in anticipation of a failover event
• Failover servers for disaster recovery in Azure NEW – Allows customers to install and run passive SQL Server instances in a separate OSE or server for disaster recovery in Azure in anticipation of a failover event

With these new benefits, Software Assurance customers can implement hybrid disaster recovery plans with SQL Server using features like Always On Availability Groups without incurring additional licensing costs for the passive replicas.

A setup can use SQL Server running on an Azure Virtual Machine that utilizes 12 cores as a disaster recovery replica for an on-premises SQL Server deployment using 12 cores. In the past, you would need to license 12 cores of SQL Server for both the on-premises and the Azure Virtual Machine deployments. The new benefit extends passive replica rights to an Azure Virtual Machine. Now a customer needs to license only the 12 cores of SQL Server running on-premises, as long as the disaster recovery criteria for the passive replica on the Azure Virtual Machine are met.

If the primary (active) replica uses 12 cores hosting two virtual machines, and the topology has two secondary replicas (one synchronous replica for high availability supporting automatic failover, and one asynchronous replica for disaster recovery without automatic failover), then the number of SQL Server core licenses required to operate this topology will be only 12 cores, as opposed to 24 cores in the past.

These high availability and disaster recovery benefits will be applicable to all releases of SQL Server. In addition to the high availability and disaster recovery benefits, the following operations are allowed on the passive replicas:
• Database consistency checks
• Log backups
• Full backups
• Monitoring resource usage data
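
To sanity-check that a secondary really is passive before relying on the benefits listed above, you can query the Always On DMVs from the replica. The sketch below uses pyodbc against a placeholder server name (dr-sql01); it illustrates the check only, and the licensing terms remain the authoritative source for what qualifies.

# Illustrative check: is the local replica a SECONDARY that allows no read
# connections? (Server name and ODBC driver are placeholders.)
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dr-sql01;DATABASE=master;Trusted_Connection=yes;"
)

QUERY = """
SELECT ar.replica_server_name,
       ars.role_desc,
       ar.secondary_role_allowed_connections_desc
FROM sys.dm_hadr_availability_replica_states AS ars
JOIN sys.availability_replicas AS ar
    ON ars.replica_id = ar.replica_id
WHERE ars.is_local = 1;
"""

with pyodbc.connect(CONN_STR) as conn:
    for name, role, reads in conn.execute(QUERY).fetchall():
        looks_passive = role == "SECONDARY" and reads == "NO"
        print(f"{name}: role={role}, readable secondary={reads}, "
              f"looks passive: {looks_passive}")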

SQL Server 2019 also provides a number of improvements for availability, performance, and security along with new capabilities like the integration of HDFS and Apache Spark™ with the SQL Server database engine.

SnapLogic iPaaS (integration platform as a service) – from Synergy Software Systems.

October 20th, 2019

Business Intelligence Managers/Analysts, Data/ETL Engineers, and Information/Data Architects are tasked with empowering business users to make use of
data to drive smart decisions and innovations. Data-driven initiatives can be challenging considering the explosion of data volumes due to the proliferation of sensors, IoT, and mobile computing.

Moreover, a growing number of groups within the business want access to fresh data.

To fully harness their data, organizations must also have a cloud strategy for their digital transformation efforts, namely to migrate data from
on-premises environments to the cloud. Considering the tremendous business value of unlocking that data, it’s imperative to prioritize and streamline these
data integration and migration projects.

Gone are the days when IT needed hundreds of coders to build extract, transform, load (ETL) solutions and then maintain those by writing more code. Modern integration platforms eliminate the need for custom coding. Now, data integration projects deploy and scale, often as much as ten times faster.

iPaaS platforms ease the pain because they’re designed for flexibility and ease of deployment for any integration project. They offer a drag-and-drop UX coupled with a powerful platform and hundreds of pre-built connectors out of the box.

The connectors are always up-to-date, so the IT organization doesn’t spend an inordinate amount of time maintaining every integration by hand. This saves an incredible amount of time, money, and frustration across the team and projects and greatly reduces risk.

Not all integration platforms are created equal. Some do simple point-to-point cloud app integrations while others transform large and complex data into a data lake for advanced analytics. Some still require extensive developer resources to hand-code APIs while others provide self-service, drag-and-drop offerings that can be used by IT and business leaders alike. Some are best for specific tactical projects while others provide a strategic, enterprise-wide platform for multi-year digital transformation projects.

Organizations must address four key steps during the data migration and integration process:
1. Capture data that supports both the known use cases as well as future undefined use cases (think IoT data to support a future machine learning
enabled use case).
2. Conform inbound data to corporate standards to ensure governance, quality, consistency, regulatory compliance, and accuracy for downstream
consumers.
3. Refine data for its eventual downstream application and/or use cases (once it has been captured and conformed to corporate standards).
4. Deliver data broadly, prepared to support future, unknown destinations.

For decades, IT has been tasked to manage integration projects by writing tons of custom code. This onerous task is even more complex with the proliferation of SaaS applications, the surge in big data, the emergence of IoT, and the rise of mobile devices. IT’s integration backlog has exploded. Not only is the deployment too much work, but there is a growing cost to maintain all of the integrations.

Deploying a tactical or departmental data warehouse solution should take days, not months. Moreover, enterprise-wide data transformation projects should take months, not years.

The best data integration platforms:
- Support multiple app and data integration use cases across cloud, on-premises, and hybrid deployments
- Offer the flexibility to be used in cloud, hybrid, or on-premises environments, regardless of the execution location
- Provide a self-service user experience aided by AI, machine learning, hundreds of pre-built connectors, and integration pipeline
templates (patterns) resulting in greater user productivity, and faster time-to-integration
- Have an underlying, scalable architecture to grow with evolving data and integration requirements
- Support different data modes such as streaming, event-driven, real-time or batch

“The SnapLogic iPaaS offering is functionally rich and well-proven for a variety of use cases. It supports hybrid deployments and provides rich and differentiating features for analytics and big data integration (Hadooplex). Clients score SnapLogic as above average for cloud characteristics, functional completeness, ease of use and ability to meet SLAs.” – Gartner

SnapLogic is a U.S.-based integration platform company. In mid-2013, it transitioned from a traditional software business to an iPaaS model with the release of the SnapLogic Elastic Integration Platform which provides a large set of native iPaaS capabilities that target the cloud service integration, analytics and big data integration use cases.

The flagship Enterprise Edition features a set of base adapters (Snaps), an unlimited number of connections and unlimited data volume.

Synergy Software Systems has been an Enterprise Solutions Integrator in the GCC since 1991. We are pleased to announce our formal partnership to represent SnapLogic in the MEA region.

Do you need to integrate with Azure? With SAP Data Warehouse Cloud? With Workday? With Odette-compliant auto manufacturers?

To learn more, call us on 009714 3365589

SQL Server 2016 Service Pack 2 (SP2) CU9 release

October 1st, 2019

Cumulative Update package 9 (CU9) (build number: 13.0.5470.0) for Microsoft SQL Server 2016 Service Pack 2 (SP2) is now available for download. (It contains fixes that were released after the initial release of SQL Server 2016 SP2.)
SQL Server CUs are certified to the same levels as Service Packs, and should be installed at the same level of confidence. Historical data shows that a significant number of support cases involve an issue that has already been addressed in a released CU.

The CU provides the following fixes and improvements (Referenced from https://support.microsoft.com/en-us/help/4100997/cumulative-update-9-for-sql-server-2016-sp1)

KB4099472 – PFS page round robin algorithm improvement in SQL Server 2016 (Area: SQL service)
KB4133164 – FIX: Error when a SQL Server Agent job executes a PowerShell command to enumerate permissions of the database (Area: Management Tools)
KB4086173 – FIX: Access violation occurs when executing a DAX query on a tabular model in SQL Server Analysis Services (Area: Analysis Services)
KB4131193 – Performance issues occur in the form of PAGELATCH_EX and PAGELATCH_SH waits in TempDB when you use SQL Server 2016 (Area: SQL service)
KB3028216 – FIX: A crash occurs when proactive caching is triggered for a dimension in SSAS (Area: Analysis Services)
KB4135113 – FIX: Change tracking record is inconsistent during an update on a table which has a clustered/unique index in SQL Server (Area: SQL service)
KB4293839 – FIX: TDE database goes offline during log flush operations when connectivity issues cause the EKM provider to become inaccessible in SQL Server (Area: SQL security)
KB4230730 – FIX: A dead latch condition occurs when you perform an online index rebuild or execute a merge command in SQL Server (Area: SQL service)
KB4163478 – FIX: An access violation occurs when incremental statistics are automatically updated on a table in SQL Server (Area: SQL performance)
KB4230306 – FIX: Restore of a TDE compressed backup is unsuccessful when using the VDI client (Area: SQL service)
KB4163087 – FIX: Performance is slow for an Always On AG when you process a read query in SQL Server (Area: SQL service)
KB4164562 – FIX: Wrong user name appears when two users log on to MDS at different times in SQL Server (Area: Data Quality Services (DQS))
KB4094893 – FIX: Database cannot be dropped after its storage is disconnected and reconnected in SQL Server (Area: SQL service)
KB4162814 – FIX: An internal exception access violation occurs and the SSAS server stops responding (Area: Analysis Services)
KB4134541 – FIX: Error in the MDS Add-in for Excel when you use the German version of Excel in SQL Server (Area: Data Quality Services (DQS))
KB4132267 – FIX: Deploying a SSAS project in SSDT is frequently unsuccessful in SQL Server Analysis Services in Tabular mode (Area: Analysis Services)
KB4101554 – FIX: Parallel redo in a secondary replica of an availability group that contains heap tables generates a runtime assert dump or the SQL Server crashes with an access violation error (Area: High Availability)
KB4098762 – FIX: Hidden parameters are included in reports when the Browser role is used in SSRS 2016 (Area: Reporting Services)
KB4134175 – FIX: Processing a cube with many partitions generates lots of concurrent data source connections in SSAS (Area: Analysis Services)
KB4091245 – FIX: Access violation occurs when you query a table with an integer column in SQL Server 2017 and SQL Server 2016 (Area: SQL performance)
KB4094706 – FIX: One worker thread seems to hang after another worker thread is aborted when you run a parallel query in SQL Server (Area: SQL service)
KB4058175 – FIX: TDE enabled database backup and restore operations are slow when the encryption key is stored in an EKM provider in SQL Server (Area: SQL service)
KB4131960 – FIX: An access violation occurs when you execute a nested select query against a columnstore index in SQL Server (Area: SQL Engine)
KB4094858 – FIX: “An unexpected error occurred” when you use DAX measures in Power BI table visualizations in SQL Server (Area: Analysis Services)
KB4101502 – FIX: TDE enabled database backup with compression causes database corruption in SQL Server 2016 (Area: SQL service)

CUs also often include supportability, manageability, and reliability updates.

Before update:

- Check compatibility with your application.
- Test CUs before you deploy to production environments.
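
One quick pre-flight check is to compare the instance’s current build with the CU9 build number quoted above. A minimal sketch, assuming a pyodbc connection to a local instance (connection details are placeholders):

# Compare the running build against CU9 (13.0.5470.0) before patching.
import pyodbc

CU9_BUILD = (13, 0, 5470, 0)

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=master;Trusted_Connection=yes;"
)
product_version, product_level = conn.execute(
    "SELECT CAST(SERVERPROPERTY('ProductVersion') AS nvarchar(50)), "
    "       CAST(SERVERPROPERTY('ProductLevel') AS nvarchar(50))"
).fetchone()
conn.close()

build = tuple(int(part) for part in product_version.split("."))
status = "at or above" if build >= CU9_BUILD else "below"
print(f"Instance is on {product_version} ({product_level}), {status} the CU9 build")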

Azure Data Share in preview

August 29th, 2019

A brand new product by Microsoft called Azure Data Share was recently announced and is in public preview. It is a simple way to control, manage, and monitor all of your data sharing.

Data which resides in Azure storage can be securely shared between a data provider and a data consumer by copying a snapshot of the data to the consumer’s subscription (called snapshot-based copying, and in the future there will be in-place sharing). It supports ADLS Gen1, ADLS Gen2, and Blob storage, and eventually will support Azure Data Explorer, SQL DB, and SQL DW.

With a few clicks you can share data with another user who has access to an Azure subscription and storage account. Data is copied and updated, and is encrypted during transit. You specify the frequency at which the data consumers receive updates.
- With Azure Data Share, as a data provider, you provision a data share and invite recipients to the data share.
- Data consumers receive an invitation to your data share via e-mail. Once a data consumer accepts the invitation, they can trigger a full snapshot of the data you shared with them.
- This data is received into the data consumer’s storage account. Data consumers can receive regular, incremental updates to the data shared with them so that they always have the latest version of the data.
- When a data consumer accepts a data share, they are able to receive the data in a storage account of their choosing. For example, if the data provider shares data using Azure Blob Storage, the data consumer can receive this data in Azure Data Lake Store.

Azure Data Share also provides a way for companies to monetize some of their valuable internal data. With all the work done to build a modern data warehouse, why not sell the data to partners and business customers? They will save a ton of time and money trying to create the same data. This needs to be considered in the context of privacy laws like GDPR.


Gartner recognized SnapLogic as a Visionary in its Data Integration Magic Quadrant

August 7th, 2019

Gartner recognized SnapLogic as a Visionary in its Data Integration Magic Quadrant! This comes on the heels of being recognized as a Leader in three top analyst reports for the best integration platform as a service (iPaaS) solutions – the Gartner Magic Quadrant, Forrester Wave, and G2 Crowd Grid.
We believe these collective recognitions testify to the fact that SnapLogic is unrivaled when it comes to integrating cloud applications and on-premises data in one unified platform.

Gartner commended SnapLogic for:
• Our powerful integration convergence and augmented data integration delivery
• Our easy accessibility to diverse user personas
• Our pricing model simplicity and trial version

Synergy Software Systems is a Middle East partner. This solution speeds up deployment of complex solutions with multiple integrations and significantly improves and simplifies the management and maintenance of integrations.

Whether for EDI to Odette standards for the automotive sector, or for streaming high volumes of data, or for ETL processes to bring data from multiple enterprise systems into a data lake, Enterprise BI or corporate performance management system, SnapLogic provides a multitude of pre-built “Snap” integrations for a low-code, configuration approach to integration.

Synergy Software Systems has provided integrated solutions in the region. The digital revolution is providing new opportunities and challenges. Robotic Process Automation, predictive analytics, ML, AI, IoT, RFID, cloud services, data lakes, and mobility are now standard components of any solution. However, the digital revolution also requires agility, rapid robust deployment, and ease of update and maintenance. Integration, ETL, and streaming data from multiple systems at enterprise scale need a new ‘productized’, low-code approach to integration.

SnapLogic is a key tool for successful agile deployment of enterprise integration, corporate performance management, EDI, BI and RPA solutions.

There are already major clients deploying SnapLogic in the UAE.

To learn more, call us on 00971 43365589

SQL Server 2016 SP2 Cumulative Update 8

August 3rd, 2019

The urgent security update earlier this month is not the only patch for SQL Server 2016 in July.
Microsoft has also released SQL Server 2016 SP2 CU8 (build number: 13.0.5426.0), which fixes issues including:
• Restores of compressed encrypted backups fail
• Data masking doesn’t work in some cases
• A DAX query needs memory 200x larger than the database size
• Peer-to-peer replication fails when your host name isn’t uppercase
• Query Store cleanup can fill the transaction log and cause an outage
• Distributed Availability Groups cause memory dumps during automatic seeding
• AG replication stops working due to internal thread deadlocks
• The deadlock monitor can cause an access violation
• Errors when you query a view with a union on a linked server
• Concurrent inserts into a clustered columnstore index can deadlock
• Infinite loop when FileTable is used for a long time without a restart
• SSAS 2016 randomly crashes (maybe not completely random if they fixed it)
• Transparent Data Encryption doesn’t encrypt if it’s restarted mid-encryption

And much more: https://support.microsoft.com/en-us/help/4505830/cumulative-update-8-for-sql-server-2016-sp2

I guess we will get a similar patch for SP1, but by now you should be on a later patch.

Office 365 will retire TLS 1.0 and 1.1 starting June 1st, 2020

July 24th, 2019

To provide best-in-class encryption, and to ensure the service is more secure by default, Microsoft is moving all of its online services to Transport Layer Security (TLS) 1.2+.

Office 365 will be retiring TLS 1.0 and 1.1 starting June 1, 2020. This means that connections to Office 365 using the TLS 1.0 and TLS 1.1 protocols will not work after that date, so plan ahead of June 1, 2020.

Plan to replace clients and devices that rely on TLS 1.0 and 1.1 to connect to Office 365.

The TLS protocol aims primarily to provide privacy and data integrity between two or more communicating computer applications. It is an IETF standard intended to prevent eavesdropping, tampering and message forgery. Transport Layer Security (TLS), and its deprecated predecessor, Secure Sockets Layer (SSL), are cryptographic protocols that provide communications security over a computer network; the protocols find use in applications such as web browsing, email, instant messaging, and voice over IP (VoIP). Websites use TLS to secure all communications between their servers and web browsers. The latest version, TLS 1.3, is an overhaul that strengthens and streamlines the crypto protocol.
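
For custom clients, the practical step is to refuse anything below TLS 1.2 and check what the server actually negotiates. A minimal Python sketch (Python 3.7+), using an example Office 365 host name purely for illustration:

# Connect with a client that rejects TLS 1.0/1.1 and report what was negotiated.
import socket
import ssl

HOST = "outlook.office365.com"  # example endpoint; substitute your own service

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0 and 1.1

with socket.create_connection((HOST, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print("Negotiated protocol:", tls.version())   # e.g. 'TLSv1.2' or 'TLSv1.3'
        print("Cipher suite       :", tls.cipher()[0])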

Work on TLS 1.3 started in April 2014, and it took four years and 28 drafts before it was approved in March of 2018. Version 1.3 makes the handshake process faster by speeding up the encryption process. This has a security benefit, and will also improve the performance of secure web applications. With TLS 1.2, the handshake process involved several round trips, whereas with 1.3 only one round trip is required, and all the information is passed at that time. In addition to security improvements, TLS 1.3 eliminated a number of older algorithms that did nothing other than create vulnerabilities. The updated protocol added a function called “0-RTT resumption” that enables the client and server to remember if they have communicated before.

The PCI compliance standards require that any site accepting credit card payments uses TLS 1.2 after June 30, 2018. Services such as PayPal, Authorize.net, Stripe, UPS, FedEx, and many others already support TLS 1.2, and have announced that they will eventually refuse TLS 1.0 connections. This means your safest action is to upgrade to TLS 1.2 or 1.3 sooner rather than later to avoid disruption. It is also likely to be a consideration for GDPR compliance in the event of a breach if you are using an older protocol.

Windows 7 exploit - critical fix July 2019

July 16th, 2019

Microsoft’s latest SSU helps fix a bug in Secure Boot that interferes with Windows’ BitLocker encryption system. The updates are available from the Microsoft Update Catalog or through Windows Server Update Services (WSUS).

Microsoft said it “strongly recommends” that users and admins install this latest SSU before installing the latest cumulative update, which was released along with this month’s Patch Tuesday updates. This month’s updates bring a fix for a Win32k zero-day, marked as CVE-2019-1132, which was part of an attack used by Kremlin-backed hackers. The researcher at ESET, Anton Cherepanov, found the exploit for the flaw, which doesn’t affect Windows 10 or Windows 8 but does impact older versions including Windows 7 SP1, Windows Server 2008 SP2, and Windows Server 2008 R2 SP1. Cherepanov noted that the technique used in the current exploit is “very similar” to one used before 2017 by the advanced hacking group called Sednit, aka Fancy Bear, APT28, STRONTIUM, and Sofacy. Windows 8 and later block a key component of the exploit chain, which is why the flaw only affects earlier supported Windows versions. He notes that Microsoft back-ported the Windows 8 mitigation to Windows 7 for x64-based systems.

Bugs like this are one reason Windows 7 users should follow Microsoft’s advice to upgrade. Those who still use Windows 7 Service Pack 1, including on 32-bit systems, should update to newer operating systems, since extended support of Windows 7 Service Pack 1 ends on January 14, 2020. That means Windows 7 users will then no longer receive critical security updates, so vulnerabilities like this one will stay unpatched forever.

This is not the only fix – the Microsoft patches address 77 security flaws, including 15 rated “critical.”
In May this year patches were also released for BlueKeep, whose ability to automatically spread from one vulnerable machine to another could be exploited in an attack on the same global scale as WannaCry, whose worm capabilities were enabled by EternalBlue, the leaked NSA exploit for the SMBv1 file-sharing protocol. The NSA urged admins to patch the flaw and change configurations to prevent potential attacks. Its warning followed research that found that at least one million Windows computers were still vulnerable to BlueKeep. The NSA said it was “likely only a matter of time” before attacks emerged.

Windows 7 updates July 2019

July 16th, 2019

Last week there was a Windows Update of security and reliability fixes for Windows 7 as part of the normal Patch Tuesday delivery cycle for every version of Windows. Microsoft split its monthly update packages for Windows 7 and Windows 8.1 into two distinct offerings: a monthly rollup of updates and fixes and, for those who want only those patches that are absolutely essential, a Security-only update package. Under Microsoft’s rules, what it calls “Security-only updates” are supposed to include only security updates, not quality fixes or diagnostic tools. However, this month’s Security-only update, the “July 9, 2019—KB4507456 (Security-only update),” bundled in the Compatibility Appraiser, KB2952664, which is designed to identify issues that could prevent a Windows 7 PC from updating to Windows 10.

The concern is that these components are being used to prepare either for another round of forced updates or to spy on individual PCs. The word telemetry appears in at least one file, and for some it seems to be a short step from innocuous data collection to spyware. Microsoft appeared to be surreptitiously adding telemetry functionality to most of its solutions. Microsoft has slipped this functionality into a security-only patch without any warning, thus adding the “Compatibility Appraiser” and its scheduled tasks (telemetry) to the update. The package details for KB4507456 say it replaces KB2952664 (among other updates). So this is not a security-only update.
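
If you want to see whether these updates are reported on a given Windows 7 PC, you can ask PowerShell’s Get-HotFix from a script. A rough sketch follows; note that components bundled inside a rollup are not always listed under their own KB number, so a miss here is inconclusive.

# Query installed hotfixes via PowerShell's Get-HotFix (Windows only).
import subprocess

KBS = ["KB4507456", "KB2952664"]

for kb in KBS:
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", f"Get-HotFix -Id {kb}"],
        capture_output=True, text=True,
    )
    found = kb.lower() in result.stdout.lower()
    print(f"{kb}: {'reported as installed' if found else 'not reported'}")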

The Appraiser tool was offered via Windows Update, both separately and as part of a monthly rollup update two years ago; as a result, most of the declining population of Windows 7 PCs already has it installed. Given the headaches users faced over unwanted upgrades back in Windows 10’s first year, why is Microsoft reluctant to talk about these issues except in formal settings like release notes and support bulletins?

This has already been an exhausting week thanks to a pair of Windows 10 zero-day exploits being used in the wild, by Kremlin-backed hackers.

Windows 10 19H2 release

July 16th, 2019

The 19H2 release of Windows 10, which will probably be called the Windows 10 October 2019 Update, will not include a list of new user-facing features. Instead, it will deliver “select performance improvements, enterprise features and quality enhancements.”

This update “will install like a monthly update” on PCs that are running the latest Windows 10 release, version 1903. In other words, it’s what we would call a service pack, even if Microsoft no longer does. Devices on any currently supported version of Windows 10 will only need to reboot once to update to 19H2. The 19H2 release will be fully supported for 30 months. While still an aggressive update schedule for some IT departments, that is a lot easier to live with than six-monthly updates. (The update is the last Windows 10 release before the end of free support for Windows 7 on January 14, 2020.)

For OEM and retail Windows editions, even Windows 10 Home, feature updates are no longer immediately mandatory. The twice-yearly feature updates are offered on PCs that Microsoft’s algorithms deem suitable, but each feature update is offered as an optional update that the PC’s owner has to approve manually. You can ignore that prompt for as long as the current version is supported, or a maximum of 18 months.

For businesses with PCs running Windows 10 Pro, the updates are delivered with the same 18-month support cycle. The difference is that administrators can defer monthly cumulative updates by up to 30 days and can defer feature updates by up to 365 days. On a PC where the Windows 10 Settings app or applied Group Policy defers feature updates, the option to update to the next release doesn’t appear at all until the deferral period ends or the current version reaches its end of support. Companies that run Windows 10 Pro should plan for an annual Windows 10 feature update; wait any longer than 12 months and you may hit an end-of-support date and a forced feature update.
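
Administrators often apply these deferrals through Group Policy, which is reflected in the Windows Update policy area of the registry. The sketch below reads two commonly used values; the key path and value names are assumptions to verify against your own policy configuration.

# Read feature/quality update deferral policy values, if any are configured.
import winreg

POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"
VALUES = ["DeferFeatureUpdatesPeriodInDays", "DeferQualityUpdatesPeriodInDays"]

try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY) as key:
        for name in VALUES:
            try:
                value, _ = winreg.QueryValueEx(key, name)
                print(f"{name} = {value} days")
            except FileNotFoundError:
                print(f"{name} not set")
except FileNotFoundError:
    print("No Windows Update policies configured on this machine")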

Customers running Windows 10 Enterprise and Education get the longest support calendar. The March updates will have an 18-month support cycle for all editions, whereas the September releases get the longer, 30-month support cycle for Enterprise and Education editions. (All Windows 10 Pro releases are supported for 18 months.) These customers can install version 1903 late in 2019 and plan to install the 19H2 release as a lightweight update when it’s ready. With that “service pack” in place, they can leave those PCs alone for two full years, until the second half of 2021.

To ensure updates don’t happen at the wrong time see this post:

https://www.techrepublic.com/article/how-to-control-updates-in-windows-10/?ftag=CMG-01-10aaa1b

P.S. Dark mode to reduce eye strain: macOS got dark mode last year in Mojave, and Android also got a dark mode setting last year; the upcoming Android Q will make it easy to turn on. You can similarly dim the lights in Windows 10: go to Settings, tap Personalization, tap Colors, and then under ‘Choose your default app mode’, choose Dark.

GDPR enforcement - be aware of what it means to you

July 15th, 2019

http://www.enforcementtracker.com/

The site reports that in Germany there have already been 101 fines made public, worth EUR 484,900. As well as the recent high-profile fines covered in this blog, there are many other actions reported on this site.

Some examples:

France: SERGIC, a company specialized in real estate development, purchase, sale, rental and property management
The two key reasons were lack of basic security measures and excessive data storage. Sensitive user documents uploaded by rental candidates (including ID cards, health cards, tax notices, certificates issued by the family allowance fund, divorce judgments, account statements) were accessible online without any authentication procedure in place.
Although the vulnerability was known to the company since March 2018, it was not finally resolved until September 2018. In addition, the company stored the documentation provided by candidates for longer than necessary. The CNIL took into account the seriousness of the breach (lack of due care in addressing the vulnerability and the fact that the documents revealed very intimate aspects of users’ lives), the size of the company and its financial standing.

Google – The fine was imposed on the basis of complaints from both the Austrian organisation “None Of Your Business” and the French NGO “La Quadrature du Net”, concerning the creation of a Google account during the configuration of a mobile phone using the Android operating system. The CNIL imposed a fine of 50 million euros for lack of transparency (Art. 5 GDPR), insufficient information (Art. 13/14 GDPR) and lack of legal basis (Art. 6 GDPR).

UNIONTRAD COMPANY – Complaints were made by several employees of the company who were filmed at their workstation. This was in breach of rules to be observed when installing cameras in the workplace, in particular, that employees should not be filmed continuously and that information about the data processing has to be provided. In the absence of satisfactory measures at the end of the deadline set in the formal notice, the CNIL carried out a second audit in October 2018 which confirmed that the employer was still breaching data protection laws when recording employees with CCTV.

Austria – A fine was imposed against a private person who was using CCTV at his home. The video surveillance covered areas intended for the general use of the residents of the multi-party residential complex: parking lots, sidewalks, courtyard, garden and access areas to the residential complex; and the video surveillance covered garden areas of an adjacent property. The video surveillance subject of the proceedings was therefore not limited to areas which are under the exclusive power of control of the controller. Video surveillance is therefore not proportionate to the purpose and not limited to what is necessary. The video surveillance records the hallway of the house and films residents entering and leaving the surrounding apartments, thereby intervening in their highly personal areas of life without the consent to record their image data.

Romania – WORLD TRADE CENTER BUCHAREST SA – A printed paper list used to check in breakfast customers contained the personal data of 46 clients staying at the WORLD TRADE CENTER BUCHAREST SA hotel, and was photographed by people outside the company, which led to the disclosure of the personal data of some clients through online publication. The operator WORLD TRADE CENTER BUCHAREST SA was sanctioned because it had not taken steps to ensure that data was not disclosed to unauthorized parties.

Hungary – A fine was imposed on an unnamed financial institution for unlawfully rejecting a customer’s request to have his phone number erased, after the institution argued that it was in the company’s legitimate interest to process this data in order to enforce a debt claim against the customer. In its decision, the NAIH emphasised that the customer’s phone number is not necessary for the purpose of debt collection because the creditor can also communicate with the debtor by post. Consequently, keeping the phone number of the debtor was against the principles of data minimisation and purpose limitation. As per the law, the assessed fine was based on 0.025% of the company’s annual net revenue.

Several countries issued fines related to misuse of data in elections.
Several countries issued fines to companies that did not respond to a request by an employee or customer about data held about them.

PwC’s own UK Privacy & Security Enforcement Tracker found that fines in the UK alone over data protection law violations totalled £6.5 million in 2018.

SQL Server 2008 and SQL Server 2008 R2 - OUT OF SUPPORT today

July 13th, 2019

SQL Server 2008 and 2008 R2: both of these versions of SQL Server went out of extended support with Microsoft on 9th July 2019.

Many companies and businesses are still on SQL Server 2008 R2 and below. There can be a number of reasons for this: maybe the applications the databases support require an older version of SQL Server, or maybe the applications are also coming to the end of their life, but their end dates do not match up with the data platform end-of-support dates.

Sometimes applications are critical to the business and everything works just fine. The business doesn’t want to disrupt the application or introduce any risk by performing a migration to a new version so why change it?

In this situation your data platform is out of support completely. Out-of-support systems attract hackers. Note the previous articles about fines for loss of private data to realise how serious this can be.

So you should be making plans to migrate your legacy SQL Servers off the unsupported versions. If you are still on an old database, it is likely that you are also on an old server and an old version of Windows. That adds the risk of failed hard disks and other system vulnerabilities: Meltdown, Spectre, phishing, and more.
Investors and insurers are not likely to be sympathetic in such circumstances.
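
A first step is simply knowing where the 2008-era instances are. The sketch below is a rough inventory pass, assuming a placeholder list of instance names and trusted connections; SQL Server 2008 and 2008 R2 both report a major version of 10.

# Flag instances still running SQL Server 2008 / 2008 R2 (major version 10).
import pyodbc

SERVERS = ["sql-legacy01", "sql-app02"]  # placeholder instance names

for server in SERVERS:
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={server};DATABASE=master;Trusted_Connection=yes;",
        timeout=5,
    )
    version = conn.execute(
        "SELECT CAST(SERVERPROPERTY('ProductVersion') AS nvarchar(50))"
    ).fetchone()[0]
    conn.close()
    major = int(version.split(".")[0])
    flag = "OUT OF SUPPORT (2008/2008 R2)" if major == 10 else "newer release"
    print(f"{server}: {version} -> {flag}")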

There are many performance and security benefits of upgrade.

You could decide to run on out-of-support software and accept the associated risk. The main advantage of this approach is that there is nothing immediate to do. The longer you run on the platform, however, the greater the chance of encountering a security vulnerability or failing a compliance test.
If anything does go wrong you’ll have no support from Microsoft.
Other software vendors’ support contracts may also require that you be on a currently supported database.

Modernise and upgrade is one of the options that you have available.

You can upgrade your on-premises SQL Server, or migrate the databases to Azure, either as an IaaS solution where you run the VM in Azure, or even to the PaaS Azure SQL Database offering.

There are a number of advantages to upgrading your data platform. You’ll be running your database workloads on an in-support data platform, with a long support window. There will likely be new features in the latest and greatest version of SQL Server that you can use to add business value to your application, Availability Groups for example. You will also likely find that skills in the later technology are more readily available in the jobs market.

There will likely be a different licensing model (the licensing model changed between SQL Server 2008 R2 and SQL Server 2012), so it is possible you will have to pay more for your SQL Server licences.

The third option is that, instead of doing nothing, you pay for a custom support agreement. The main advantage here is that you can continue to get security updates and therefore potentially remain compliant. The main disadvantage of this approach is the cost involved, which is typically 75% of the full license costs of the latest version of SQL Server and Windows Server.

The fourth option is to migrate the workload to Azure. Microsoft allows SQL Server 2008 and SQL Server 2008 R2 VMs running in Azure to receive security updates for free for a further three years, so you can migrate your database server to Azure and continue to get security updates for free until 2022.

The main advantage of this is that you get to keep running the same version of the OS and data platform, and the security updates are free, so the cost is minimal. The disadvantage is that you would need to move off-premises; if that is not possible for you then you can’t exercise this option, and there will still be work involved in ‘lifting and shifting’ the VM to the cloud.

Whatever you do, now that support has ended for SQL Server 2008 and SQL Server 2008 R2, have a plan.