Archive for the ‘Security and Compliance’ category

SQL Server 2008 and SQL Server 2008 R2 – end of life July 9, 2019 – ask Synergy Software Systems

June 23rd, 2019

Microsoft has previously announced that SQL Server 2008 and SQL Server 2008 R2 will reach end of life on July 9, 2019.

This means that in less than a month, Microsoft will no longer release regular security updates for the product.

There are several reasons this is important to you.
• Attacks against software products of all types are common and ongoing. With Microsoft SQL being such a prevalent platform, attacks against it are ubiquitous, and it’s important to keep your database platform up-to-date with the latest Microsoft security patches.
• Many compliance requirements dictate that you must be running currently supported software.
• As Microsoft drops support for a product, many third-party applications may also discontinue support for their products running on those platforms.

So, if you are still running SQL Server 2008/2008 R2, then what are your options?

1. Upgrade to a newer version of SQL Server.
SQL Server 2019 is in preview release as of this writing, so the current production version is SQL Server 2017. Its end of life will be October 12, 2027.
Evaluate your applications and databases to make sure they are compatible, e.g. Dynamics AX 2012 is not supported beyond SQL Server 2016.

Plan a migration for either on-premises or cloud. A move to an Azure SQL Database Managed Instance will not require you to upgrade in the future. By choosing this option, you will also gain access to new features which have appeared in the latest SQL Server versions. However, it only offers a subset of SQL Server features, so you need to be sure it will support your application and usage.

2. Migrate to Azure to receive three more years of Extended Security Updates for SQL Server 2008/2008 R2. If you need to stay on the same SQL code base for a bit longer, Microsoft will allow you to rehost your SQL 2008 environment in Azure and still provide you with security updates for an extended period. There is no extra cost for the extended updates beyond the standard Azure VM rates.

3. Purchase extended support. Microsoft allows customers with an active Enterprise Agreement and Software Assurance subscription to purchase and receive three years of Extended Security Updates for SQL Server 2008/2008 R2. The annual outlay for the updates is 75% of the full license cost.

4. The least desirable option is to stay where you are and pray. If circumstances prevent you from moving forward now, then at minimum you should:
• Recognize and account for the risk;
• Plan and budget for a transition as soon as possible;
• Re-evaluate your security and tighten it as much as possible.

Microsoft provides guidance for handling the end of support of SQL Server 2008/2008 R2 at https://www.microsoft.com/2008-eos.

Of course, Synergy is ready to help you to evaluate and to progress to the next level. 0097143365589

If you are running newer versions of SQL Server, then here are their end-of-life dates:
• SQL Server 2012 – July 12, 2022
• SQL Server 2014 – July 9, 2024
• SQL Server 2016 – July 14, 2026
• SQL Server 2017 – October 12, 2027
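
To find out quickly which of your instances fall into which bracket, a small script can report the version of each server. This is a minimal sketch, assuming Python with the pyodbc package and a SQL Server ODBC driver installed; the server names and authentication are placeholders, not from any specific environment:

```python
# Sketch: report each instance's SQL Server version and its end-of-support date.
# Assumes pyodbc and the "ODBC Driver 17 for SQL Server" are installed; server names are placeholders.
import pyodbc

# Major version number -> (product name, end of support), per the dates listed above.
EOL = {
    "10": ("SQL Server 2008/2008 R2", "July 9, 2019"),
    "11": ("SQL Server 2012", "July 12, 2022"),
    "12": ("SQL Server 2014", "July 9, 2024"),
    "13": ("SQL Server 2016", "July 14, 2026"),
    "14": ("SQL Server 2017", "October 12, 2027"),
}

servers = ["SQLPROD01", "SQLPROD02"]  # hypothetical instance names - replace with your own

for server in servers:
    conn = pyodbc.connect(
        f"DRIVER={{ODBC Driver 17 for SQL Server}};SERVER={server};Trusted_Connection=yes;",
        timeout=5,
    )
    version = conn.cursor().execute(
        "SELECT CAST(SERVERPROPERTY('ProductVersion') AS varchar(32))"
    ).fetchone()[0]                      # e.g. '10.50.6000.34'
    major = version.split(".")[0]
    name, eol = EOL.get(major, ("Unknown/newer version", "check Microsoft lifecycle pages"))
    print(f"{server}: build {version} -> {name}, end of support {eol}")
    conn.close()
```

Running this across your estate gives a simple starting inventory for the upgrade planning described above.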

UAE and AI

June 20th, 2019

A report commissioned by Microsoft and conducted by EY says the UAE has seen the second highest AI investment in the region over the past decade, at more than USD 2.15 billion.
• One in five companies in the country consider AI as their top digital priority
• 94% of C-suite leadership consider ‘AI strategy’ as an important topic and 35% of non-managerial staff are actively having AI discussions

New research shows the state of AI within businesses across the UAE is expected to improve dramatically over the next three years, as a growing number of executives look to AI to drive their digital agendas. Already, 18% of businesses in the country consider AI their most important digital priority. (AI Maturity Report in the Middle East and Africa (MEA) – a new study commissioned by Microsoft and conducted by EY.)

The UAE’s progress in elevating the AI agenda is a direct result of leaders across the country recognising that the technology is a key differentiator across all sectors. 94% of companies in the UAE report involvement in AI at executive management level – the highest percentage of any surveyed country in MEA.

“When we examine companies with high AI maturity, it’s clear that the technology is driven directly by the CEOs themselves. This high level of involvement typically results in greater investment in AI, broader adoption and a greater number of successful implementations,” says Sayed Hashish, regional general manager at Microsoft Gulf.

Leadership capability in the UAE is also rated high when compared with other countries in MEA. While 64% of respondents believe they have moderate, little or no AI leadership competency, 24% of executives in the UAE rated themselves as highly competent, with another 46 percent indicating they are either competent or very competent. Most companies still consider themselves to be in the planned phase of AI maturity, meaning AI has not yet been put to active use. On the opposite end of the spectrum, just 8% of businesses perceive themselves as advanced in their application of AI.

It’s not surprising that the UAE is the second highest regional investor in AI over the past ten years, investing $2.15 billion in total. The bulk of this investment went towards social media and Internet of Things (IoT) transactions. This was followed by notable spend across a further eight technologies, including smart mobile, gamification, and machine learning.

Machine learning is ranked as the most useful AI technology, with primary emphasis placed on decision support solutions, then smart robotics and text analysis, where customer interactions are the key focus.

The UAE’s open culture around AI is a highly positive indicator of the health of the technology within the country. 94% of UAE companies have ‘AI Strategy’ as an important topic at C-suite level and a significant 35% of companies say AI discussions are filtering down from top management right the way through to non-managerial levels. As a result, employees in the UAE embrace opportunities to participate in skills training and pilot programmes.

UAE companies are, in general, heavily focused on customer engagement when it comes to AI. The use of chatbots in the marketing space has become common, largely because they enhance the customer experience, ultimately demonstrating obvious value to management. UAE respondents expect AI to deliver greater operational efficiencies, drive down costs and, most importantly, enable them to be more competitive. Companies within the Emirates view prediction (76%) and automation (76%) as the most relevant applications of AI for their businesses.

65% of UAE companies rate themselves as highly to very competent when it comes to drawing on external alliances to strengthen their AI capabilities.

Synergy Software Systems offers Integration as a service, Robotic Process Automation, Machine learning, and Advanced analytics solutions.

Databases breaches

June 18th, 2019

Verizon publishes its Data Breach Investigations Report annually; the latest report is the 11th edition, and all are extremely well detailed. Not all data breaches are discovered, and those that are discovered aren’t necessarily reported. The 2018 report covers 53,000 incidents, defined as: a security event that compromises the integrity, confidentiality or availability of an information asset. It also covers 2,216 breaches, which are defined as: an incident that results in the confirmed disclosure — not just potential exposure — of data to an unauthorized party.

These numbers do NOT include breaches involving botnets. The additional 43,000 successful accesses via stolen credentials associated with botnets are handled in a special insights section of the report.

Those are scary numbers.

The Verizon report shows 73% perpetrated by outsiders, 28% involving internal actors, 2% involving partners, 2% featuring multiple parties, 50% carried out by organized criminal groups, and 12% involving actors identified as nation-state or state-affiliated. These figures refer to confirmed data breaches, not all security incidents. While 28% involve internal users, the bulk of data breaches were caused by people outside the organization, using malware or social attacks, or exploiting vulnerabilities created by errors.

While the internal actors weren’t identified for all of the reported data breaches, an analysis of 277 data breaches attributes them to: 72 system admins, 62 end users, 62 other, 32 doctors or nurses, 15 developers, 9 managers, and 8 executives.

Database administrators may focus on denying permissions to developers for production, but developers proved much less likely to be involved in data breaches than system admins – a group which includes the DBAs.

You don’t need production system access to cause a data breach. It’s common practice in an enterprise to make copies of production data for use by analysts, developers, product managers, marketing professionals, and others.

Privacy law compliance makes this all the more concerning.
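
One practical mitigation is never to hand analysts or developers a raw copy of production data: mask or pseudonymise the direct identifiers before the extract leaves the production environment. The following is a minimal sketch, assuming Python; the file and column names are hypothetical examples, not taken from any specific system:

```python
# Sketch: mask direct identifiers in a production extract before sharing it with developers/analysts.
# File and column names below are hypothetical placeholders.
import csv
import hashlib

def pseudonym(value: str, salt: str = "rotate-me") -> str:
    """Replace a real identifier with a stable, non-reversible token."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

MASK_COLUMNS = {"customer_name", "email", "phone"}  # direct identifiers to mask

with open("production_extract.csv", newline="") as src, \
     open("dev_copy.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        for col in MASK_COLUMNS & set(row):
            row[col] = pseudonym(row[col])
        writer.writerow(row)
```

The masked copy still supports development and testing, but a leak of that copy no longer exposes real customer identities.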

Biometrics – privacy and security concerns

June 18th, 2019

On Monday last week a US Customs and Border Protection (CBP) subcontractor suffered a data breach that exposed the photos of tens of thousands of travelers coming in and out of the United States through specific lanes at a single Port of Entry over a period of one and a half months, in what was described as a “malicious cyber-attack.”

The database of traveler photos and license plate images was transferred to a CBP subcontractor’s network without the federal agency’s authorization or knowledge, the CBP explained. The subcontractor’s network was then hacked. CBP said its own systems had not been compromised. Fortunately, no other identifying information was included with the photos, and no passport or other travel document photos were compromised.

Images of airline passengers from the air entry and exit process were also not involved.

CBP’s “biometric entry-exit system” is the government initiative to biometrically verify the identities of all travelers crossing US borders, which it is racing to implement so as to use facial recognition technology on “100 percent of all international passengers,” including American citizens, in the top 20 US airports by 2021.

The concern is whether that urgency is outpacing vetting, regulatory safeguards, and privacy legislation. Only last month, Perceptics, the maker of vehicle license plate readers used by the US government and cities to identify and track citizens, was hacked, and its files were dumped online. It is not clear whether the attacks were connected.

Summer discount on SQL protection tools

June 18th, 2019

Security, privacy, performance, and uptime all depend on a well-maintained database with timely reports and alerts.
For an enterprise system this is mission critical.
Few DBAs have the time and training to write and maintain a comprehensive set of scripts, yet they also often lack the tools to do the job.
We offer a fantastic suite of tools to help you to manage your SQL databases, with discounted bundled prices.
With privacy laws and compliance adding to the ever-rising security threats, a well-managed system is now often key to whether a customer will share data with you, and directly impacts whether you can secure and retain business.

Talk to us about system health checks and administrator training, and how you can reduce administrative cost and system risk with SQL management tools.
00971 43365589

Addressing WannaCry risks in your organization

May 30th, 2019

WannaCry—the most damaging cyberattack of 2017—continues unabated, with at least 3,500 successful attacks per hour, globally, according to research published by security firm Armis on Wednesday.
The research estimates that 145,000 devices worldwide continue to be infected, noting that “a single WannaCry infected device can be used by hackers to breach your entire network.”

The primary reason WannaCry persists is an abundance of unpatched Windows versions across healthcare, manufacturing, and retail sectors— a “large number of older or unmanaged devices which are difficult to patch due to operational complexities,” Ben Seri, research vice president at Armis, wrote in a blog post. The number of active Windows 7 (and older) installations across those sectors exceeds 60%.

This is in large part a vendor issue, because these industries rely on third-party hardware with poor lifetime support. There are operational reasons to hold on to old and unsupported Windows devices. Manufacturing facilities rely on the HMI (Human-Machine Interface) devices that control the factory’s production lines. HMI devices run on custom-built hardware, or use outdated software that hasn’t been adapted to the latest Windows.

In healthcare organizations, many of the medical devices themselves are based on outdated Windows versions, and cannot be updated without complete remodeling.

In retail environments, the point-of-sale devices are the weak link: based on custom hardware, they receive updates late, if at all.

This is a particularly pressing issue given the pending end of support for Windows 7 in January 2020, which will further complicate the security posture of many enterprises, especially as other “wormable” vulnerabilities are discovered, such as BlueKeep, which prompted Microsoft to provide patches for Windows XP and Server 2003 due to the potential risk the vulnerability posed.

The WannaCry attack had the potential to be far more damaging than it was, though for affected organizations the damages were quite severe—the NHS reported losses of £92 million ($116 million).

Security researcher Marcus Hutchins discovered a kill-switch domain name in the program that the authors had left unregistered, and registered it. When WannaCry executes, if the domain resolves, the program exits. While this bought additional time for defenses, WannaCry was reported as “stopped,” which may have lowered concern about the attack. Days later, a variant lacking a kill switch was discovered.

An analysis by GCHQ’s cybersecurity division identified the authors of WannaCry as the Lazarus Group, a North Korea state-sponsored threat actor, also responsible for the 2014 Sony Pictures hack. The US, Australia, New Zealand, Canada, and Japan have criticized North Korea for their involvement in the attack, according to ZDNet.

WannaCry is built on top of a pair of exploits called EternalBlue and DoublePulsar, which were released by an organization called The Shadow Brokers on April 14, 2017. The exploits were originally developed by the NSA Office of Tailored Access Operations and CIA Information Operations Center. The weaponization—rather than responsible disclosure—of those underlying exploits created an opportunity for the WannaCry attack to be waged.

Microsoft president and chief legal officer Brad Smith condemned the “stockpiling of vulnerabilities by governments,” noting that “We have seen vulnerabilities stored by the CIA show up on WikiLeaks, and now this vulnerability stolen from the NSA has affected customers around the world. Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage,” and “We need governments to consider the damage to civilians that comes from hoarding these vulnerabilities and the use of these exploits.”

To reduce potential risks from WannaCry, patch your devices. That requires IT professionals to know that the devices exist: “Without the proper control and monitoring of devices and networks, organizations are bound to lose track of both.” You must maintain a continuous asset inventory of all devices, and monitor your network for unknown, suspicious, or misplaced devices connected to it.
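
As a starting point for that inventory, even a crude sweep of which hosts on a subnet answer on TCP port 445 (the SMB port WannaCry spreads through) highlights devices to investigate. This is a minimal sketch, assuming Python; the address range is a placeholder, and you should only scan networks you are authorised to test:

```python
# Sketch: find hosts on a subnet that expose SMB (TCP 445), the port WannaCry spreads through.
# The subnet below is a placeholder - scan only networks you are authorised to test.
import socket
import ipaddress

SUBNET = "192.168.1.0/24"   # hypothetical example range
exposed = []

for host in ipaddress.ip_network(SUBNET).hosts():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.3)
        if s.connect_ex((str(host), 445)) == 0:   # 0 means the port answered
            exposed.append(str(host))

print(f"{len(exposed)} host(s) answering on TCP 445:")
for host in exposed:
    print(" ", host)
```

A result like this is only a first pass; the hosts it finds still need to be checked for the MS17-010 patch and for whether SMBv1 is disabled.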

Major SQL updates – don’t skip SQL Server 2016 SP2 CU7 and SQL Server 2017 CU15

May 26th, 2019

This week, Microsoft released two major updates.

SQL Server 2016 SP2 CU7 has multiple fixes including:

• Filtered index corruption
• Access violations in sys.dm_exec_query_statistics_xml, sys.dm_hadr_availability_replica_states, sys.availability_replicas, sys.dm_db_xtp_hash_index_stats, sys.fn_dump_dblog, sys.dm_db_xtp_checkpoint_files
(I.e. if you monitor your servers, which you should, then you should apply this CU to avoid problems caused by the monitoring tool’s queries)
• AG failover fails
• Incorrect query results on columnstore indexes

SQL Server 2017 CU15 has even MORE fixes – read the full list: https://support.microsoft.com/en-us/help/4498951/cumulative-update-15-for-sql-server-2017

Note also that, from SQL Server 2017, the Analysis Services build version number and the SQL Server Database Engine build version number do not match.

There are some CUs you might be tempted to skip because they don’t affect you. These releases affect a wide range of features, however, and you should plan to apply them sooner rather than later.
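
After applying a CU, it is worth confirming that every instance actually reports the expected build. This is a minimal sketch, assuming Python with pyodbc; the server name is a placeholder and the target build number should be taken from the CU’s KB article linked above, not from the placeholder in the code:

```python
# Sketch: confirm an instance's build is at or above the build shipped with the CU.
# The target build below is a placeholder - take the real number from the CU's KB article.
import pyodbc

TARGET_BUILD = (14, 0, 0, 0)   # placeholder: replace with the build listed in the KB article
SERVER = "SQLPROD01"           # hypothetical instance name

conn = pyodbc.connect(
    f"DRIVER={{ODBC Driver 17 for SQL Server}};SERVER={SERVER};Trusted_Connection=yes;"
)
row = conn.cursor().execute(
    "SELECT CAST(SERVERPROPERTY('ProductVersion') AS varchar(32)),"
    "       CAST(SERVERPROPERTY('ProductLevel')   AS varchar(32))"
).fetchone()
build = tuple(int(part) for part in row[0].split("."))
status = "OK" if build >= TARGET_BUILD else "NEEDS PATCHING"
print(f"{SERVER}: build {row[0]} ({row[1]}) -> {status}")
conn.close()
```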

Microarchitectural Data Sampling – a new security threat to chips

May 16th, 2019

To address a novel set of side-channel attacks that allow Microarchitectural Data Sampling (MDS), this week Intel released a set of processor microcode fixes, to be paired with operating system and hypervisor patches from vendors like Microsoft and those distributing Linux and BSD code.

These side-channel holes can potentially be exploited to extract information, such as passwords and other secrets, from memory the attacking code is not allowed to touch. Browser histories can be sniffed, virtual machines snooped on, disk encryption keys stolen, and so on.

MDS can expose sensitive data held in a processor’s internal buffers: store buffers, fill buffers, and load buffers. MDS samples snippets of data as opposed to grabbing it all at once – more like eavesdropping on privileged communications than breaking in. It’s not easy to target specific data or to differentiate valuable information from background noise. Intel maintains the vulnerabilities are difficult to exploit outside of a laboratory environment.

However, TechRepublic commented: “MDS attacks are as pernicious a threat as Spectre and Meltdown, and like those security vulnerabilities, the extent to which devices are vulnerable depends on vendor (i.e., Intel vs. AMD) and product generation. These vulnerabilities also affect cloud computing services, as they can be leveraged by attackers to escape software containers, hypervisors, paravirtualized systems, and virtual machines.”

To make such attacks more efficient, an attacker might seek to have the targeted app running on the same physical core as the malware, on an adjacent thread, so as to run load and flush operations repeatedly.

Speculative execution is a shortcut used by modern processors to execute software instructions before they’re needed. That boosts performance but creates vulnerabilities – however, those appear to be limited to Intel hardware, and have not been replicated on Arm or AMD-designed processors.

The researchers who identified the flaws argue that hardware fixes for the Meltdown vulnerability implemented in Whiskey Lake and Coffee Lake CPUs are not enough, and that software-based isolation of user and kernel space – which comes with a performance hit – needs to be enabled even on current processors.

Intel acknowledges there may be a performance hit due to the microcode fixes in some circumstances for some workloads.

- Whiskey Lake and Coffee Lake CPUs have mitigations built in
- Earlier processors need to install microcode fixes.
- Operating systems and hypervisors need to be updated to work with the microcode updates to ensure those function properly.

Patches are rolling out today from Microsoft, Apple, Google, Linux distributions, and others.
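
On Linux, the kernel reports whether the MDS mitigations (microcode plus OS updates) are active for the running processor through sysfs, so you can verify that the patches actually took effect. A minimal sketch, assuming Python on a Linux kernel recent enough to expose the vulnerabilities directory:

```python
# Sketch: report the kernel's view of speculative-execution vulnerabilities, including MDS.
# Requires a Linux kernel new enough to expose /sys/devices/system/cpu/vulnerabilities/.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")

if not vuln_dir.is_dir():
    print("Kernel too old to report vulnerability status - update it.")
else:
    for entry in sorted(vuln_dir.iterdir()):
        # Each file (e.g. 'mds', 'meltdown') contains a one-line status such as
        # 'Mitigation: Clear CPU buffers; SMT vulnerable' or 'Vulnerable'.
        print(f"{entry.name:20} {entry.read_text().strip()}")
```

On Windows, the equivalent check is done with vendor guidance and Microsoft’s published tooling rather than this sketch.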

The store buffer is a microarchitecture element that turns a stream of store operations into serialized data and masks the latency from writing the values to memory. It stores data asynchronously so the CPU can do out-of-order execution. The operations for reassembling everything in the right order make Meltdown-like unauthorized memory reads possible. A technique called Data Bounce can access supposedly inaccessible kernel addresses and break KASLR (kernel address space layout randomization), reveal the address space of Intel SGX enclaves, and even break ASLR (address space layout randomization) from JavaScript. Data Bounce is also invisible to the operating system – it doesn’t involve a syscall and it doesn’t trigger an exception.

Intel disagrees with the researchers about the need to disable hyperthreading, and says it plans to add additional hardware defenses to address these vulnerabilities in future processors.

Security threats

May 13th, 2019

Security threats continue to haunt us.

Systems at a number of Baltimore’s city government departments were taken offline on May 7 by a ransomware attack. As of 9:00am today, email and other services remain offline. Police, fire, and emergency response systems have not been affected by the attack, but nearly every other department of the city government has been affected in some way.

Calls to the city’s Office of Information Technology are being answered by a recording stating, “We are aware that systems are currently down. We are working to resolve the issue as quickly as possible.”

Meanwhile, this post on identity theft https://www.schneier.com/blog/archives/2019/05/protecting_your_2.html
and this one on credit card skimming on vulnerable e-commerce sites make sobering reading: https://arstechnica.com/information-technology/2019/05/more-than-100-commerce-sites-infected-with-code-that-steals-payment-card-data/

Wi-Fi 6 – what is it? Why does it matter?

April 13th, 2019

Wireless speeds will soon get a lot faster thanks to the introduction of Wi-Fi 6 later this year.

Wi-Fi 6 is the next evolution of wireless local area network (WLAN) technology and it will improve upon older Wi-Fi standards, especially with the coming release of 5G wireless technology. With Wi-Fi 6 and 5G emerging onto the market at roughly the same time, it would make sense that they’re somehow related; while both promise similar improvements, they’re distinctly different technologies.

The name Wi-Fi 6 is part of a new naming convention from the Wi-Fi Alliance, intended to make the standards more easily understood by Wi-Fi users – much like the 3G/4G/5G naming convention used by cellular data networks. Behind the Wi-Fi 6 name is the latest version of the 802.11 wireless networking standard: 802.11ax. This new Wi-Fi standard is reportedly up to 30% faster than Wi-Fi 5. Speed hasn’t been the main benefit touted by the Wi-Fi Alliance and other industry experts, though; Wi-Fi 6 also brings lower latency, more simultaneously deliverable data, and improved power efficiency.

Latency is a significant problem, especially for mobile, internet and cloud users – i.e. just about everyone. Orthogonal Frequency Division Multiple Access (OFDMA) is an improvement on Orthogonal Frequency Division Multiplexing (OFDM). OFDM is used by Wi-Fi 5, 4, and older standards to encode and transmit data from multiple clients or access points (APs), which contend for the ability to transmit; once the network is idle, data can be transmitted. OFDM is a popular and reliable way to decentralize access, but it has a major problem in that it can lead to serious latency.

OFDMA makes a major change and puts the transmission coordination in the hands of 802.11ax APs. The AP centrally schedules data transmission and is able to further divide frequencies so as to transmit data to/from multiple clients at the same time. The aim is to reduce latency and increase network efficiency—especially in high-demand environments like stadiums, conference halls, and other public spaces. OFDMA broadcasts multiple signals at the same time, and can also increase the unit interval, which means outdoor Wi-Fi deployments will be faster and more reliable.

Wi-Fi 6 will extend the capabilities of Multi-User Multi-Input/Multi-Output (MU-MIMO). MU-MIMO was previously available only for downstream connections and allowed a device to send data to multiple receivers at the same time; Wi-Fi 6 will add MU-MIMO capabilities to upstream connections to allow more simultaneous devices on one network. MU-MIMO is already in use in modern routers and devices, but Wi-Fi 6 upgrades it. The technology allows a router to communicate with multiple devices at the same time, rather than broadcasting to one device, and then the next, and the next. Right now, MU-MIMO allows routers to communicate with four devices at a time; Wi-Fi 6 will allow them to communicate with up to eight. As an analogy, compare adding MU-MIMO connections to adding delivery trucks to a fleet: you can send each of those trucks in different directions to different customers. Before, you had four trucks to fill with goods and send to four customers. With Wi-Fi 6, you now have eight trucks.

Extending the truck analogy, OFDMA allows one truck to carry goods to be delivered to multiple locations. The network looks at a ‘truck’, sees that only, say, 75 percent of its load capacity has been allocated and that another customer is on the same route, and fills up the remaining space with a delivery for the second customer. In practice, this is all used to get more out of every transmission that carries a Wi-Fi signal from a router to your device.

How fast is it?

– The short answer: 9.6 Gbps, compared to 3.5 Gbps on Wi-Fi 5.

– The real answer: both of those speeds are theoretical maximums that you’re unlikely to ever reach or need in real-world Wi-Fi use. The typical download speed in the US is just 72 Mbps, or less than 1 percent of the theoretical maximum speed. The fact that Wi-Fi 6 has a much higher theoretical speed limit than its predecessor is still important, because that 9.6 Gbps can be split up across a whole network of devices – which means more devices can be served, or more potential speed for each device.
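
To put those figures in perspective, here is a quick back-of-the-envelope calculation using only the numbers quoted above (the nine-device household average appears later in this post):

```python
# Back-of-the-envelope arithmetic using the figures quoted in this post.
wifi5_max_gbps = 3.5
wifi6_max_gbps = 9.6
typical_us_mbps = 72

print(f"Headline gain: {wifi6_max_gbps / wifi5_max_gbps:.1f}x the theoretical ceiling")
print(f"Typical use:   {typical_us_mbps / (wifi6_max_gbps * 1000):.1%} of the Wi-Fi 6 ceiling")

# The ceiling matters because it is shared: with 9 devices (the average home today),
# Wi-Fi 6 still leaves roughly 1 Gbps of theoretical headroom per device.
print(f"Per-device share across 9 devices: {wifi6_max_gbps / 9:.2f} Gbps")
```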

When Wi-Fi 5 came out, the average US household had about five Wi-Fi devices in it. Now, homes have nine Wi-Fi devices on average, and various firms have predicted we’ll hit 50 on average within several years. Those added devices take a toll on your network. Your router can only communicate with so many devices at once, so the more gadgets demanding Wi-Fi, the more the network overall is going to slow down. At first, Wi-Fi 6 connections aren’t likely to be substantially faster. A single Wi-Fi 6 laptop connected to a Wi-Fi 6 router may only be slightly faster than a single Wi-Fi 5 laptop connected to a Wi-Fi 5 router. Devices are, however, more likely to maintain fast speeds on busy networks.

As more and more devices get added onto your network, current routers might start to get overwhelmed by requests from a multitude of devices; Wi-Fi 6 routers are designed to more effectively keep devices up to date with the data they need. Each device’s speed may not be faster than it can reach today on a high-quality network, but devices are more likely to maintain those top speeds in busier environments. In a home where one person is streaming Netflix, another is playing a game, someone else is video chatting, and a whole bunch of smart gadgets – a door lock, temperature sensors, light switches, and so on – are all checking in at once, the top speeds of those devices won’t necessarily be boosted, but the speeds you see in typical, daily use will likely be better. Exactly how much faster will depend on how many devices are on your network and just how demanding those devices are. In a cloud world working on HTML5 pages rather than traditional client/server TCP/IP applications, and with growing use of social media, digital storage, streaming video, AI, and querying of data lakes, it is essential that the underlying infrastructure keeps up. We are seeing similar evolution with databases, chips and memory.

Wi-Fi 6 introduces some new technologies to help mitigate the issues that come with putting dozens of Wi-Fi devices on a single network. It lets routers communicate with more devices at once, lets routers send data to multiple devices in the same broadcast, and lets Wi-Fi devices schedule check-ins with the router. Together, those features should keep connections strong even as more and more devices start demanding data.

Wi-Fi 6 will also:
• Increase the number of transmit beamforming streams to eight in order to increase network range and throughput;
• use both the 2.4 GHz and 5GHz bands simultaneously to greatly improve performance;
• use 1024 quadrature amplitude modulation (1024-QAM) to increase throughput for emerging use cases (Wi-Fi 5 uses 256-QAM);
• implement individual target wake time (TWT) to improve battery life and reduce power consumption for Wi-Fi devices;
• introduce spatial reuse technology that will allow devices to more easily access a Wi-Fi network in order to transmit data.

Wi-Fi 6 allows devices to plan out communications with a router, reducing the amount of time they need to keep their antennas powered on to transmit and search for signals. That means less drain on batteries and improved battery life in turn. This is a feature called Target Wake Time, which lets routers schedule check-in times with devices.
Your laptop needs constant internet access, so it’s unlikely to make heavy use of this feature (except, perhaps, when it moves into a sleep state). This feature will be more valuable for smaller, already low-power Wi-Fi devices that just need to update their status every now and then. (Think small sensors placed around a home to monitor things like leaks or smart home devices that sit unused most of the day.)

Wi-Fi generations rely on new hardware, not just software updates, so you’ll need to buy new phones, laptops, and so on to get the new version of Wi-Fi. New devices will start coming with Wi-Fi 6 by default. As you replace your phone, laptop, and game consoles over the next five years, you’ll bring home new ones that include the latest version of Wi-Fi. There is one thing you will have to make a point of going out and buying: a new router. If your router doesn’t support Wi-Fi 6, then you won’t see any benefits, no matter how many Wi-Fi 6 devices you have. (You may see a benefit, though, connecting Wi-Fi 5 gadgets to a Wi-Fi 6 router, because the router may then be capable of communicating with more devices at once.)

Wi-Fi 6 also arrives with a new security protocol called WPA3, which makes it harder for hackers to crack passwords. For a Wi-Fi 6 device to receive certification from the Wi-Fi Alliance, WPA3 is required (so be aware that it may not be included in uncertified devices).

So where does 5G fit in?
5G is the umbrella term for the fifth generation of mobile network technology, and it encompasses a lot of different elements. Cellular, or mobile networks, rely on licensed spectrum bands, auctioned off to the highest bidder. Carriers, like Verizon or AT&T, pay to use those bands. To roll out coverage they build a network of connected base stations capable of sending out a strong enough signal that it can serve multiple people (thousands in urban areas) at once. To recoup their investment, we pay them subscriptions.

Wi-Fi relies on unlicensed spectrum, which is free to use, but the signal is relatively weak. We pay an Internet Service Provider (ISP) to deliver the internet to our door and then use a router to fill our house with Wi-Fi. We use the same frequency bands as our neighbors, and that is a problem when you live in a very densely populated area. The two frequencies that Wi-Fi uses are 2.4 GHz and 5 GHz. The 2.4 GHz band has a lower potential top speed but it penetrates better, so it has a longer range than 5 GHz.

(Note that 5 GHz Wi-Fi has absolutely nothing to do with 5G mobile networks.)

In everyday life, most of us rely on Wi-Fi both at home and in the office — or in coffee shops — and mobile networks when we step out the front door and move out of range of the router. (Though for security reasons I would never recommend anyone to use a public hotspot.)

Smartphones switch automatically and we don’t have to give it any thought; we just want a good connection at all times. That will continue to be the case for the vast majority of people after 5G rolls out. The difference is that both mobile networks and Wi-Fi are going to get faster. The prospect of download speeds between 1 Gbps and 10 Gbps, and latency of just 1 millisecond, has us excited about 5G. The reality is that we will not get anywhere near the theoretical top speeds. The speed of your 5G connection will depend on many factors, including where you are, which network you connect to, how many other people connect, and what device you use.

The aim is to achieve a minimum download speed of 50Mbps and latency of 10ms. That will represent a major improvement over current average speeds, but just as with 4G LTE, 5G coverage is going to expand slowly. It’s also going to work hand-in-hand, not just with Wi-Fi, but with earlier generations of mobile network technology, so 4G LTE will continue to be offered as a fallback and will continue to evolve and get faster.

Goodbye XP

April 13th, 2019

This week we have the end of Windows XP support, which means the lifespan of the OS was over 17 years. That’s a long time to run any system without an upgrade. There are probably still a few SQL Server 2000 systems out there, which is older than XP, and likely a few of them are running on Windows 2000 in a VM somewhere. So some companies will continue to run XP, and probably some ATMs, kiosk displays, and other embedded applications will show that XP start screen on occasion.

Cloud back ups or on-premise?

February 16th, 2019

Pretty scary – this recent announcement from a hosting provider:
“We have suffered catastrophic destruction at the hands of a hacker, last seen as aktv@94.155.49.9. This person has destroyed all data in the US, both primary and backup systems. We are working to recover what data we can.”

Though they’re back up and running, who knows if customers will stick by them, or will sue them.
What impact that had on infrastructure mail servers, backup servers, and SQL Servers for customers is hard to judge.
A large number of people might have lost their mailboxes and previously stored mail that was in IMAP storage.
This is likely an annoyance for individuals, but potentially catastrophic for businesses. Imagine your small business hosted with them and all your mailboxes were lost with customer communications and who knows what else.

Could this happen with a cloud provider like Azure O365, Google Apps or AWS?
Maybe, but they will have DR backups.
But what if you store backups in the cloud but run on-premise – how long would it take to mass restore multiple customers? Do you still have adequate on-premise test systems to restore to, and the staff and the time to do it?

Do you assume that you will always have both a primary server and an online backup server/share/bucket/container, and that you can download the data?
The problem is that online systems that connect to the primary can be accessed.
If an attacker were to access one, they potentially could access the second.
The world seems to be moving towards more online storage, or in the case of cloud vendors, a reliance on snapshots. That might be good enough for cloud vendors, but is it good enough for your on-premise system?
It’s likely that an attacker, possibly even with insider help, would wipe out backups first, then primary systems.
Some sort of disconnected, offline backup of data, especially for database servers, gives you a third line of defence.
Don’t forget that backups need to be tested: is the backup software compatible with old versions? Does your backup use the same version as the current ERP software installed on your primary, or the same SQL Server version? (i.e. when you upgrade, do you also upgrade your backups, or maintain an older environment?)
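
One simple, scriptable check is to run RESTORE VERIFYONLY against recent backup files so that an unreadable or mismatched backup is flagged before you actually need it. This is a minimal sketch, assuming Python with pyodbc; the server name and paths are placeholders, and VERIFYONLY only checks that the backup is complete and readable, so periodic full test restores are still needed:

```python
# Sketch: verify that recent backup files are readable by the current SQL Server instance.
# RESTORE VERIFYONLY checks the backup is complete and readable; it is not a full test restore.
import pyodbc

SERVER = "SQLPROD01"                                     # hypothetical instance name
BACKUPS = [r"\\backupshare\erp\erp_full_20190216.bak"]   # placeholder paths

conn = pyodbc.connect(
    f"DRIVER={{ODBC Driver 17 for SQL Server}};SERVER={SERVER};Trusted_Connection=yes;",
    autocommit=True,   # RESTORE cannot run inside a user transaction
)
cursor = conn.cursor()
for path in BACKUPS:
    try:
        # Path is embedded directly for this sketch; sanitise or whitelist paths in real use.
        cursor.execute(f"RESTORE VERIFYONLY FROM DISK = N'{path}'")
        print(f"OK      {path}")
    except pyodbc.Error as err:
        print(f"FAILED  {path}: {err}")
conn.close()
```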

Microsoft and other large vendors have had downtime, whether self-induced by releasing code too early, or due to hardware failure, or malicious attack. What is important to realise is just how infrequent such issues are given the number of clients they have across a range of solutions, how little downtime there was, and how fast they are at addressing issues that arise. Then think about how you would have been able to deal with the same issues in your own server room.

There are increasing risks, and increasing issues of statutory compliance with regard to data protection, e.g. GDPR. The cloud generally offers cheap storage and robust systems, yet it needs to be part of a holistic approach to reduce overall risk and cost, and not the only line of defence.

What does GDPR mean for Big Data Analytics and AI?

January 27th, 2019

By 2020, there will be an estimated 24 billion internet-connected devices globally – more than four devices for every person. Many consumers have concerns about data privacy and how their data is used and protected (some surveys put this at 90% of users). As businesses learn to extract value from and utilize data at a deeper level, it is essential for companies to be extremely conscientious about protecting personal information.

The recent Google 50 million euro GDPR fine, posted about on our blog, has major implications for data-insight-driven companies. Secondary processing of data using iterative analytics and AI needs to remain legal under the GDPR – i.e. GDPR-compliant technical and organizational safeguards must be in place that:

(1) Satisfy a balance-of-interest test that requires functional separation (to separate the information value of data from the identity of data subjects) to reduce the negative impact on data subjects, so that the data controller’s legitimate interests are not overridden. See Annexures 1 and 2 of this note: https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp217_en.pdf

Recent high-profile lawsuits against Oracle and Acxiom make it clear that simply claiming a “legitimate interest” in commercializing personal data is not enough. (see the video here http://fortune.com/2018/11/08/privacy-international-oracle-acxiom/)

(2) Ensure compliance with requirements that the secondary processing is compatible with the original purpose for which the data was collected;

(3) By default restrict access to only the minimum data necessary for each purpose for which it is processed – such Data Minimisation is a level of granular control and protection that cannot be achieved by technologies like encryption alone.
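
As an illustration of what functional separation and data minimisation can look like in practice, here is a minimal sketch, assuming Python; the field names are hypothetical, and the pseudonymisation key is assumed to be held outside the analytics environment:

```python
# Sketch: minimise and functionally separate a record before it enters an analytics pipeline.
# Field names are hypothetical; the pseudonymisation key must be stored away from the analysts.
import hmac
import hashlib

PSEUDONYM_KEY = b"keep-this-key-outside-the-analytics-environment"
ANALYTICS_FIELDS = {"segment", "country", "monthly_spend"}   # only what the analysis needs

def prepare_for_analytics(record: dict) -> dict:
    """Return a minimised record with the identity replaced by a keyed pseudonym."""
    token = hmac.new(PSEUDONYM_KEY, record["customer_id"].encode(), hashlib.sha256).hexdigest()
    minimised = {k: v for k, v in record.items() if k in ANALYTICS_FIELDS}
    minimised["subject_token"] = token   # analysts see the token, never the raw identifier
    return minimised

# Example: the name, email and raw ID never reach the analytics store.
raw = {"customer_id": "C-1001", "name": "A. Example", "email": "a@example.com",
       "segment": "retail", "country": "FR", "monthly_spend": 120.0}
print(prepare_for_analytics(raw))
```

The keyed hash keeps the analytical value of the data (records for the same subject still link together) while the identity itself stays with whoever holds the key.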

Data Privacy Day 2019, which is tomorrow, Monday 28 January 2019, is led by the National Cyber Security Alliance (NCSA) in the United States and is built on the theme “Respecting Privacy, Safeguarding Data and Enabling Trust.”

GDPR starts to bite

January 22nd, 2019

Google has been hit with a record fine by French data regulator CNIL of 50m euros ($56.7m) for breaching GDPR, after the regulator found that Google had a “lack of transparency, inadequate information and lack of valid consent regarding ads personalisation”.
The regulator also said that users were not sufficiently informed about how Google uses personal data for advertising. The fine relates to two complaints filed by privacy advocacy groups, which were filed as soon as GDPR came into force in May last year. The groups also claim that Google does not have a valid legal basis to process user data for ad personalisation, as mandated by the GDPR. Google also selects ad personalisation by default for new users, instead of offering an ‘opt in’, which is also against GDPR rules.

Under the GDPR, complaints are transferred to local data protection regulators. While Google’s European HQ is in Dublin, the CNIL concluded that the team in Dublin doesn’t have the final say when it comes to data processing for new Android users.

In a statement, Google said: “People expect high standards of transparency and control from us. We’re deeply committed to meeting those expectations and the consent requirements of the GDPR. We’re studying the decision to determine our next steps.”

The large fine reflects the view that the violations were continuous, and still occurring. Google’s violations were aggravated by the fact that “the economic model of the company is partly based on ads personalisation”, and that it is therefore “its utmost responsibility to comply” with GDPR.

Dr Lukasz Olejnik, an independent privacy researcher and adviser, said the ruling was the world’s largest data protection fine. “This is a milestone in privacy enforcement, and the history of privacy. The whole European Union should welcome the fine. It loudly announced the advent of GDPR decade,” he said.

Facebook is also faced with huge fines. Facebook has been fined €10m (£8.9m) by Italian authorities for misleading users over its data practices. The two fines issued by Italy’s competition watchdog are some of the largest levied against the social media company for data misuse, dwarfing the £500,000 fine levied by the British Information Commissioner’s Office in September for the Cambridge Analytica scandal– the maximum that body was able to issue. The Italian regulator found that Facebook had breached articles 21, 22, 24 and 25 of the country’s consumer code by: Misleading users in the sign-up process about the extent to which the data they provide would be used for commercial purposes.

Emphasising only the free nature of the service, without informing users of the “profitable ends that underlie the provision of the social network”, and so encouraging them to make a decision of a commercial nature that they would not have taken if they were in full possession of the facts. Forcing an “aggressive practice” on registered users by transmitting their data from Facebook to third parties, and vice versa, for commercial purposes.

The company was specifically criticised for the default setting of the Facebook Platform services, which in the words of the regulator, “prepares the transmission of user data to individual websites/apps without express consent” from users. Users can disable the platform, but the regulator found that its opt-out nature did not provide a fully free choice. As an additional penalty, the authority has directed Facebook to publish an apology to users on its website and on its app.

In a statement, a Facebook spokesperson said: “We are reviewing the Authority’s decision and hope to work with them to resolve their concerns. This year we made our terms and policies clearer to help people understand how we use data and how our business works. We also made our privacy settings easier to find and use, and we’re continuing to improve them. You own and control your personal information on Facebook.”

On Friday (14 December), Facebook disclosed that a bug gave hundreds of apps unauthorised access to photos that users had uploaded but hadn’t made public. The bug is understood to have run for 12 days between 13 and 25 September. To compound matters, Facebook failed to disclose the issue promptly within 72 hours.

The bug is the latest in a series of privacy scandals. Facebook disclosed a security breach on Sept. 28, saying 50 million accounts had their login access tokens stolen. That figure was later reduced to 30 million, and Facebook confirmed that 29 million of the impacted users had their names and contact information exposed. Among those users, 14 million also had other personal information, such as their gender, relationship status and their recent place check-ins, stolen by the attackers. Facebook told the Irish Data Protection Commission that 10 percent of the affected accounts were European, according to Graham Doyle, the commission’s head of communications. The accounts were hacked in an access token harvesting attack. The security incident, revealed last week, was caused by a vulnerability in Facebook’s code which permitted attackers to steal access tokens. Access tokens are used to keep Facebook users logged in when they switch over to a public profile view via the “View As” feature.

A KPMG global study in 2018 revealed that 77% of consumers are totally against their data being sold.

A CNIL ruling in October last year against the company Vectuary has a lot of significance. Data privacy experts consider the regulator was stating that consent to processing personal data cannot be gained through a framework arrangement which bundles a number of uses behind a single “I agree” button that, when clicked, passes consent to partners via a contractual relationship. That CNIL decision implies that bundling consent to partner processing in a contract is not sufficient, or valid, consent under the European Union’s General Data Protection Regulation (GDPR) framework.

The firm was harvesting personal data (including people’s location and device IDs) on its partners’ mobile users via an SDK embedded in their apps, and receiving bids for this data via another standard piece of the programmatic advertising pipe — ad exchanges and supply side platforms — which also get passed personal data so those can broadcast it widely via the online ad world’s real-time bidding (RTB) system to solicit potential advertisers’ bids for the attention of the individual app user… The wider the personal data gets spread, the more potential ad bids. CNIL discovered the company was holding the personal data of a staggering 67.6 million people when it conducted an on-site inspection of the company in April 2018 and yet Vectuary’s website claims it doesn’t store 70% of its data.

GDPR, Article 5, paragraph 1, point f, requires that personal data be “processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss.” If you cannot protect data in this way, then the GDPR says you cannot process the data. So the complaint is not just about the data, the consent, or the data sharing as such, but rather that the processing is not adequately secure or controlled.

End of mainstream support for Microsoft Dynamics AX 2009, Dynamics AX 2012, and Dynamics AX 2012 R2

December 29th, 2018

Reminder – end of mainstream support for Microsoft Dynamics AX 2009, Dynamics AX 2012, and Dynamics AX 2012 R2 was October 9, 2018.

Upgrade is not trivial, especially when you have lots of customisations and bespoke reports and interfaces. Plan plenty of time for conversion, testing and contingency. There is a backlog of companies that need to migrate and only a limited number of skilled consultants available.

Decide as soon as possible whether on-premise or on cloud. If on-premise then consider what extra hardware you will need and whether you also need to upgrade SQL server. Don’t forget that SQL license costs have also changed.

It is not too early to start budgeting – find out what you get and don’t get on the cloud; there are both hidden costs (e.g. extra backup storage space) and hidden savings (e.g. electricity). What extra environments or storage will you need, e.g. for dev and test, over those provided by Microsoft? How have license types and costs changed? Understand the Modern Lifecycle Support update policy.

Dynamics AX 2009 Service Pack 1 (SP1), Dynamics AX 2012, and Dynamics AX 2012 R2:
Mainstream support ended on October 9, 2018; after that date, only security hotfixes will be provided for these three versions through the extended support period that continues until October 12, 2021.

Dynamics AX 2012 R3
Mainstream support continues through October 12, 2021. Microsoft will provide security hotfixes, non-security hotfixes, and regulatory updates for Dynamics AX 2012 R3 throughout that mainstream support period. The source code for these non-binary, non-security hotfixes and regulatory updates will continue to be available for customers active on the Enhancement Plan or Software Assurance.

Can customers on Premier Extended Hotfix Support or on Unified Support Advanced and Performance Levels get a non-security hotfix or regulatory update?

No. Neither non-security hotfixes nor regulatory updates will be available for Dynamics AX 2009 SP1, Dynamics AX 2012, or Dynamics AX 2012 R2 during the Extended Support phase of the product lifecycle.

While the ability to request a non-security hotfix for select products is included with Unified Support Advanced and Performance Levels, Microsoft has determined that non-security hotfixes cannot be provided with a commercially reasonable effort for these products. As a result, no requests for non-security hotfixes or regulatory updates will be accepted.

However, Microsoft will continue making security hotfixes, non-security hotfixes, and regulatory updates for Dynamics AX 2012 R3 throughout that mainstream support period. The source code for these non-binary, non-security hotfixes and regulatory updates will continue to be available for customers, and their partners, active on the Enhancement Plan or Software Assurance. Dynamics AX 2009 SP1, Dynamics AX 2012, and Dynamics AX 2012 R2 customers can selectively integrate those changes as required. Customers and partners can get the source code from packages attached to relevant Dynamics AX 2012 R3 KB articles published on LCS and discoverable through LCS Issue Search.

Will I still get a regulatory update for Dynamics AX 2009 Service Pack 1, Dynamics AX 2012, and Dynamics AX 2012 R2?

No. Microsoft will only provide regulatory updates for Dynamics AX 2009 Service Pack 1, Dynamics AX 2012, and Dynamics AX 2012 R2 for regulatory changes with law enforcement dates on or earlier than October 9, 2018.

What happens if a new bug is found by a customer in Dynamics AX 2009 Service Pack 1, Dynamics AX 2012, or Dynamics AX 2012 R2?

The bug must be reproducible in Dynamics AX 2012 R3. If it is reproducible and accepted, then a hotfix will be provided for Dynamics AX 2012 R3 and the customers can elect to integrate this hotfix in their version themselves, or work with their partners to integrate the changes.

How are binary hotfixes handled for Dynamics AX 2009 Service Pack 1, Dynamics AX 2012, and Dynamics AX 2012 R2?

If a hotfix is needed for a part of the system where Microsoft does not provide the source code and it is not a security bug, then a hotfix will not be provided.

To discuss a move to Dynamics 365 Finance and Operations call Synergy Software Systems your Dynamics Partner for over 15 years : 009714 3365589