Change to Azure Active Directory Multi-Factor Authentication Breaks Outlook

Posted by: mbmadmin

I’ve been using MFA with Office 365 for some time. I was using the Authenticator app on my smartphone and entering the six-digit code when challenged by the MFA mechanism. However, I recently changed the MFA process so that the Authenticator app simply pops up a notification asking for approval, dispensing with the six-digit number.

This worked nicely for the web and other applications but, after a day or so, Outlook failed to connect and repeatedly prompted for a password. No matter what password I entered, or how often, it kept coming back with this dialogue…

The solution was to run Microsoft’s Support and Recovery Assistant for Office 365, which is better known as SARA. You can download it here: https://aka.ms/Sara. Once installed, the wizard looks like this…

Naturally, mine was an Outlook issue so I started on that path…

There was an exact entry for my issue so my choice was clear…

Obviously, if the tool is going to help, it needs to be run on the machine with the issue…

Enter your Office 365 credentials…

I’m using MFA for my Office 365 account, so I had to go through the MFA login…

Choose your work account…

Enter your O365 password…

Complete the MFA authorisation and you’ll be presented with this page in the wizard…

It looks like Outlook doesn’t work quite as smoothly with MFA as it should. It seems to need an App password. I don’t remember creating an App password, so I opted to click ‘create a new one’ and was sent to this page…

(It’s https://account.activedirectory.windowsazure.com/AppPasswords.aspx.) Initially, for me, this page was empty; there were no App passwords. Clicking ‘Create’ lets you add one to the page, like this…

You’re asked to create a name for your password. I went with ‘My O365 App Password’ and clicked Next…

The wizard then creates a strong password for you. You really need to make a note of it somewhere. Once you’ve got a copy, you can return to SARA; it’s this password that it’s asking for in the page from before…

SARA then does some checks…

Then…

It’s offering to fix Outlook, so let’s try it. Click Yes…

Seems fair, let’s do it…

Close Outlook, then click Next…

Outlook starts and asks for the password. This time, it’s the new App password, so paste it in. Then, as if by magic, Outlook connects and email flows again…

Conclusion

It seems that, for Outlook, it’s important to set and know your App password. I’m sure this will become smoother over time, but it’s a bit of a pain at the moment. I might try turning MFA off, or switching authentication method again, to see if that also breaks Outlook. Enabling MFA is great for security and should be done in every organisation, but we don’t want Outlook breaking everywhere!

On a plus note, the SARA tool seems pretty good.

Posted in: Cloud, Office 365, Tip

Getting Started with Azure Sphere on the Seeed MT3620

Posted by: mbmadmin

Getting Started with Azure Sphere on the Seeed MT3620 Development Kit

(Tip: Connect your development board using the lead that came in the box.)

1. On your Windows 10 PC with Visual Studio 2017, install the Azure Sphere SDK Preview for Visual Studio.

2. You’ll need a business or school Microsoft Azure account; these come with Azure AD, which Sphere uses for access control.

3. You need to add an Azure Sphere tenant to your Azure AD. The SDK will have installed the Azure Sphere Developer Command Prompt; use that to run…

azsphere login

4. You’ll probably see something like this…

This is because there’s presently no Sphere tenant in your Azure AD.

5. Use this to create a new tenant…

azsphere tenant create --name <my-tenant>

In our case, we’ll use “MBM Ltd” as the tenant name, like so…
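
azsphere tenant create --name "MBM Ltd"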

6. Next, take the advice on the screen and claim your development board so that it’s associated with your tenant, using…

azsphere device claim

7. Time to connect your device to your WiFi. From the Sphere command prompt, use this to join the device to your network…

azsphere device wifi add --ssid <yourSSID> --key <yourNetworkKey>

Check it’s connected with…

azsphere device wifi show-status

8. Time to update the device. Use this to check its over-the-air (OTA) update status…

azsphere device show-ota-status

It can take a while; give it half an hour. It’ll appear to hang for a while, so be patient.

Time for the sample app

9. Configure the device for debugging using…

azsphere device prep-debug

10. Let’s make some lights blink! Go to Visual Studio and create a new project. You’re looking for a Visual C++ cross-platform project for Azure Sphere; it’s called ‘Blink Sample for MT3620 RDB (Azure Sphere)’.

11. With luck, the code should run straight away. Press F5 to build the project, deploy it to the board and start debugging. If you add a breakpoint on this line…

if (newButtonState == GPIO_Value_Low) {

(Line 96 in my version of the demo code.)

It’ll hit the breakpoint when you press Button A on the MT3620.

Conclusion

That was pretty simple. In the next article, I’ll show you how to deploy code over the air using feeds and device groups.

Posted in: IoT

Using DevExpress NuGet Server with Visual Studio Team Services

Posted by: mbmadmin

We wanted to build an ASP.NET web app using the DevExpress tools but deploy it to Azure using Visual Studio Team Services (VSTS) CI/CD. The app ran nicely locally on the desktop but failed to build in VSTS: NuGet in the build agent claimed it couldn’t find the DevExpress packages. This is because the ‘NuGet restore’ step, by default, only uses the nuget.org repo. Here’s how to get better control over the repos that the NuGet process in the build agent uses.

1. Create a nuget.config file and add it to your solution in Visual Studio. It’ll look something like this; use your own DevExpress feed URL…
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    <add key="DevExpress" value="https://nuget.devexpress.com/UseYourOwnNotOurs/api" />
  </packageSources>
</configuration>

2. Sync the new file to your VSTS code.

3. Edit the ‘NuGet restore’ step in your VSTS CI build: set it to use the feeds in your own NuGet.config and point it at the file in your repository, rather than the default nuget.org-only behaviour…

If you run a build now, NuGet will use your nuget.config file, find the repos you specified (including the DevExpress one), retrieve the packages and build your code.

It seems to work OK for us.

Posted in: Uncategorized

How to Build a Recommendations Service in Azure and Microsoft Cognitive Services

Posted by: mbmadmin

How to Build an AI-based Recommendations Service in Azure and Microsoft Cognitive Services

Overview

We’ve all seen Amazon’s clever ‘Customers who bought this item also bought…’ feature on their website. Amazon use it to upsell and to expose parts of their catalogue that customers might not know about. It’s clear that Amazon spent a great deal of time designing and implementing their solution, but it is now within reach of small organisations too, thanks to services like the one we’ll show here.

How it Works

You might be able to imagine for yourself how this kind of recommendations engine (or collaborative filter, to give it its technical name) works. In Amazon’s case, they upload vast quantities of sales data to an AI they’ve built and then, when you’re browsing their products, they use the AI to find the other products bought by people who also bought the product you’re looking at. The site then shows this list as a scrollable table on the product page. This all happens instantly, in real time.
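
To make that concrete, here’s a toy sketch of the ‘customers who bought X also bought…’ calculation. To be clear, this is only the intuition, not the algorithm the Cognitive Services template actually uses (which also handles weighting, time decay and so on), and the sample data is made up…

using System;
using System.Collections.Generic;
using System.Linq;

// Toy 'customers who bought X also bought...' calculation.
// Illustrates the co-occurrence idea only; NOT the real Cognitive Services algorithm.
class CoOccurrenceDemo
{
    static void Main()
    {
        // (customerId, itemId) purchase records - made-up sample data
        var sales = new[]
        {
            (User: 456, Item: "RedBucketSmall"),
            (User: 456, Item: "BlueBucketSmall"),
            (User: 998, Item: "RedBucketSmall"),
            (User: 998, Item: "RedBucketLarge"),
        };

        string target = "RedBucketSmall";

        // Find the customers who bought the target item...
        var buyers = new HashSet<int>(
            sales.Where(s => s.Item == target).Select(s => s.User));

        // ...then rank everything else those customers bought by how often it co-occurs
        var alsoBought = sales
            .Where(s => buyers.Contains(s.User) && s.Item != target)
            .GroupBy(s => s.Item)
            .OrderByDescending(g => g.Count());

        foreach (var g in alsoBought)
            Console.WriteLine($"{g.Key}: bought by {g.Count()} of the same customers");
    }
}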

In our example, we too will upload raw sales data, train the AI and then create a web service we can use in our website. This is done using the Microsoft Cognitive Services Recommendations solution template running in Microsoft Azure.

Getting Started

You’ll need a working Azure account. If you have one already you can use that; otherwise, you can get a free Azure account here. Please sign in to your Azure account now.

The Recommendations Solution Template

Go here to find the template we’ll be using. Here’s what it looked like at the time of writing…

Deploying the Solution

It’s important to understand that almost everything you do in Azure costs money, and this is true of the resources created for this solution. Please give careful consideration to this as we move through the deployment steps.

Click the ‘Deploy’ button to start the process.

This is step one of the Cortana Intelligence Deployment wizard…

Complete the form to your requirements. Here we’ve given the deployment a name (this will be used to create an Azure Resource Group), chosen our subscription and picked a location. Hit Create.

This page tells us what needs to be created for the solution to run: an App Service account, a Storage Account and Application Insights to monitor how it’s doing…

Click Next to move along.

This page asks which kind of App Service you want to create. Feel free to review the different specifications, as this will affect the performance and functionality of the service.

For our purposes, a Basic 1 plan is fine. Click Next.

The next page is about replication in the Storage Account. You can choose either local or geo replication.

For our purposes, LRS (local replication) is fine. Click Next.

The next page is about Application Insights. It’s this that allows you to see the performance and usage of your new recommendations service.

Choose your closest region and click Next.

The template will now deploy the solution into your Azure account. It takes a couple of minutes, and then you’ll get an email and a new page with some important information on it…

All of those URLs and keys can be found within the resources created by the wizard, but this is a handy consolidated view of all the important information, so you might want to copy and paste it into a document for safe keeping.

Our New Recommendations Service Resource Group

Go to your Azure portal and look at your resource groups. You should see a new one named after the first field on the first page of the deployment wizard; in our case, recsrvc1. Here’s what you’ll find inside…

These are a fairly standard set of resources: an App Service Plan and its associated App Service, Application Insights and a Storage Account.

Loading Some Data

The wizard has deployed all of the back-end code into the Azure resources for you, so let’s take a look. From the final step of the deployment wizard, select the Admin Key and copy it to your clipboard. Then click on the Recommendations UI link. You should see something like this…

Paste the Admin API Key into the field and click Login.

You’ll now see an empty recommendations solution with no model…

Train Your First Model

Click on Train New Model and let’s see what we need to do…

Hmm, it looks a bit complicated, doesn’t it? Fear not; here’s what we need to do:

  1. Prepare your sales data
  2. Prepare your Storage Account
  3. Upload your sales data
  4. Fill out the form
  5. Click ‘Train’

Prepare Your Sales Data

This page contains everything you need to know about this solution, including the source code and some examples. The Getting Started Guide is pretty easy to follow. However, we’re interested in getting our sales data into the correct format, and this document talks about that. Microsoft refer to the data as Usage Events; simply put, it’s a big list of what you sold, who to and when.

The Usage file is a comma-separated value (CSV) file in this format…

<UserID>,<ItemID>,<Time>

So an example might look like this…

456,RedBucketSmall,2018-01-09T16:05:26
456,BlueBucketSmall,2018-01-09T16:05:26
998,RedBucketLarge,2018-01-09T16:35:24

Here you can see that customer ID 456 bought a small red bucket and a small blue bucket at the same time (presumably on the same order), and then, shortly after, customer ID 998 bought a large red bucket.

The trickiest part of this exercise is getting your sales data into this format. It’s just these three items that you need: who, what and when.

Naturally, if you can get your raw data into Excel, that’s a great place to work with it and get it into the correct format. Getting the time into the correct format can be tricky; however, you can select your time column, right-click and choose Format Cells, then ‘Custom’, and enter ‘yyyy-mm-dd"T"hh:mm:ss’ into the Type field. This will format the time as required.

Once you’ve got your sales data into the correct format, you need to export it as a CSV to your local machine.
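
If you’d rather script the conversion than wrangle Excel, here’s a minimal C# sketch of the idea; the in-memory records below are hypothetical stand-ins for however your own system stores its sales…

using System;
using System.Globalization;
using System.IO;

// Minimal sketch: write sales records out as <UserID>,<ItemID>,<Time> usage rows.
// The sample records are hypothetical; substitute your own data source.
class UsageExport
{
    static void Main()
    {
        var orders = new[]
        {
            (UserId: 456, ItemId: "RedBucketSmall", When: new DateTime(2018, 1, 9, 16, 5, 26)),
            (UserId: 998, ItemId: "RedBucketLarge", When: new DateTime(2018, 1, 9, 16, 35, 24)),
        };

        using (var writer = new StreamWriter("UsageExport.csv"))
        {
            foreach (var o in orders)
            {
                // The "s" (sortable) format gives exactly yyyy-MM-ddTHH:mm:ss
                writer.WriteLine($"{o.UserId},{o.ItemId},{o.When.ToString("s", CultureInfo.InvariantCulture)}");
            }
        }
    }
}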

Preparing Your Storage Account

Now that we’ve got our data in the correct format, we need to prepare the Azure Storage Account to hold it. A convenient tool for this is the Azure Storage Explorer, a local application that lets you interact with your Storage Account directly. Download and install Storage Explorer from here.

Open Storage Explorer, add your Azure Account and sign in.

The interface is much like Windows File Explorer: you’ll see your Azure accounts on the left. Navigate to your subscription, then the new Storage Account and finally the Blob Containers. You can see that the deployment wizard has already created some containers (rather like Windows folders) for the internal parts of the AI models, etc.

Right-click on ‘Blob Containers’, choose ‘Create Blob Container’ and call it ‘sales-data’. Then click on the new ‘sales-data’ container, click on the ‘New Folder’ icon in the toolbar and create a folder called ‘usage’.

Upload Your Sales Data

You now need to upload your sales data CSV file into that ‘usage’ folder. Choose ‘Upload’ from the toolbar and select that you want to upload files.

Use the three dots to find the sales data file you created and upload it. Our file is called UsageExport.csv…

Fill Out the Model Training Form

Now that we have our usage data in place, let’s return to the New Model form and complete it as best we can. Here’s what the completed form should look like; we’ve used ‘sales-data’ as the container name and ‘usage’ as the sub-folder. (Interestingly, the model will import all the files it finds in that folder, but we just have one for now.)

I usually change ‘Co-occurrence Unit’ to ‘Timestamp’ and set the ‘Decay Period’ to 60 days. You can view the details of all these options using the ‘Model Parameters Documentation’ link.

Click ‘Train’

Time to train the model: click ‘Train’. The model will now be created and trained using your data. It doesn’t take long for our 40,000-row data file, but that will vary with your dataset. Here’s how it looks when it’s finished…

Let’s Try it Out!

Click ‘Score’ to open a page that lets you play with your newly created recommendations service. Try entering one of your popular product IDs into the field and click ‘Get Recommendations’. Product ID 7 is popular for us; here’s what we see…

The results field shows a sorted list of products bought by other customers who also bought product ID 7. The Score value represents how confident the AI is about its decision.

If you enter a list of product IDs, as you would if the customer had items in their website shopping cart, you receive a set of results based upon all of their items…

See how the list updates and the confidence score changes?

Your Website

In the real world, you’d need this to be integrated into your own e-commerce website, and for that you’d use your web developers. The newly trained model can be accessed using a REST-based API. Full integration is beyond the scope of this document, but we can help with that, and with the routine sales data updates needed to keep the model relevant.
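
To give your web developers a head start, here’s a minimal C# sketch of calling such a service with HttpClient. Note that the route, query parameter and header name below are assumptions for illustration only; the solution’s own API reference (linked from the deployment page) has the real details…

using System;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch of calling the recommendations service from server-side code.
// The route, parameter and header names are ASSUMPTIONS for illustration;
// check the solution's API reference for the actual endpoint.
class RecommendationsClient
{
    static async Task Main()
    {
        using var client = new HttpClient
        {
            BaseAddress = new Uri("https://<your-app-service-name>.azurewebsites.net/")
        };

        // The deployment wizard gave us API keys; we assume they go in a header like this
        client.DefaultRequestHeaders.Add("x-api-key", "<your-api-key>");

        // Hypothetical endpoint: item-to-item recommendations for product ID 7
        string json = await client.GetStringAsync("api/models/<your-model-id>/recommend?itemId=7");

        Console.WriteLine(json); // a JSON list of recommended items with scores
    }
}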

Extra Stuff

What you’ve seen here is actually just the tip of the iceberg; this recommendations service has many customisable parameters and advanced options:

  • You can, for instance, upload quoted items and clicked/browsed items (as well as sold items) into the model but have them ‘weigh’ less in the AI’s decision making.
  • You can customise the results set for each specific customer based on their previous purchases.
  • You can artificially add weight to particular/new items to ‘promote’ them in the recommendations.

Conclusion

Adding a recommendations engine to your e-commerce website (or customer emails*) could work very nicely for you. Amazon are tight-lipped about how well it works for them, saying only, “Our mission is to delight our customers by allowing them to serendipitously discover great products”, but I think it’s clear that recommendations are critical to their e-commerce experience.

* Don’t forget to use the recommendations engine to customise your newsletters for each of your customers. Amazon do this all the time too!

Next Step

We would be delighted to help you design, implement and maintain a recommendation service like this for you. Please feel free to contact us or reach out to me directly if you have any questions.

Author

Jason Timmins – Technical Director – MBM Ltd – 01902 324494 – jason@mbmltd.co.uk

Posted in: AI, Cloud

Connecting TTN to Azure Functions

Posted by: mbmadmin

The Things Network and Microsoft Azure Functions

Overview

We love The Things Network (TTN) for low-power, LoRaWAN-based IoT hardware, and we love Microsoft Azure Functions for serverless data processing. This article shows how to use them together.

The process involves creating an Azure Function to act as a webhook and then using the TTN HTTP integration to send data to that function when data arrives from your devices. The Azure Function can also send data back to the IoT device through the same integration, in the other direction (aka a downlink).

Create a Webhook-based Azure Function

Sign in to your Azure Portal and add a ‘Function App’, choosing the appropriate pricing model.

Inside your newly created Function App, add a new function. This will show a list of function templates; find the HTTP Trigger function (in your preferred language; we’ll be using C#) and add it to the Function App. (You could probably use a Generic Webhook too, but the Trigger template works nicely.)

The default sample code looks like this…
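
(Reproduced here for reference: this is roughly the C# HTTP trigger template as it appeared at the time of writing, so yours may differ slightly.)

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Parse the 'name' query parameter
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // Get request body
    dynamic data = await req.Content.ReadAsAsync<object>();

    // Set name to query string or body data
    name = name ?? data?.name;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}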

This function handles a web request, finds a ‘name’ parameter and returns a response saying ‘Hello Name’. In our case, however, we want to take the JSON that comes from the TTN integration and process it. Firstly, you’ll need to add some libraries: add this to the top of your code…

#r "Newtonsoft.Json"

using Newtonsoft.Json;
using System.Text;
using System.Net;
using System.IO; // for the Stream used by the downlink code later

This will give us the JSON library for processing the returned JSON data, plus the libraries we’ll need to send requests back to TTN.

Remove the body of the sample trigger and we’ll replace it with our own code. Here’s how the body of the request from TTN can be captured…

// Get request body
dynamic reqbody = await req.Content.ReadAsAsync<object>();
byte[] data = Convert.FromBase64String((string)reqbody.payload_raw);
string decodedString = Encoding.UTF8.GetString(data);

This gives us a ‘reqbody’ object that contains all the data from TTN’s JSON, and a ‘decodedString’ string containing the JSON of the data from your devices. The JSON structure of the TTN uplink can be found here – https://www.thethingsnetwork.org/docs/applications/http/. Most of the message is to do with the LoRaWAN and TTN network itself; it’s only “payload_raw” (and “payload_fields”) that actually contain the data from your devices. In our case, we only get “payload_raw”, a Base64-encoded JSON object which our code converts to ‘decodedString’ for use in your Azure Function.

This line will send the data from the device and the TTN device ID to the log. (You can see the function logs by expanding the ‘Logs’ panel at the bottom of the function page.)

log.Info("Data: " + decodedString + " " + reqbody.dev_id);

Nothing will come through to the function at the moment because there’s no TTN integration, so let’s create that now.

Create a TTN Webhook Integration

Visit your TTN console and select the application you want to send data to Azure from. Choose ‘Integrations’ from the menu bar and then ‘add integration’; the page looks like this…

Choose the ‘HTTP Integration’ and you’ll get its settings page…

You’ll want to take a look at the TTN documentation link as there’s useful stuff in there. For now, enter a sensible name for your integration in ‘Process ID’, then choose ‘default key’ in the ‘Access Key’ field. The URL comes from your Azure Function, so go back to your Azure portal and find ‘</> Get function URL’ (top right) on your function source code page. Copy the URL to the clipboard; it’ll look something like this…

https://<your-app-service-name>.azurewebsites.net/api/<your-trigger-function-name>?code=<your-function-key>

Switch back to the TTN integration page and paste that URL into the ‘URL’ field. Leave all the other fields blank and save your new integration.

Uplink Testing

You now need to make your devices send some data. The TTN integration will then call your Azure Function using the URL you gave it, and you’ll be able to see those calls in the Logs panel of the function portal. In our example, a function log looks like this…

2017-12-15T15:24:52.146 Function started (Id=2d52f339-8c9c-487f-a58d-aad93349535f)
2017-12-15T15:24:52.146 C# HTTP trigger function processed a request.
2017-12-15T15:24:52.146 Data: {"distance": 52, "speed": "25"} pycom_lopy_01
2017-12-15T15:24:52.177 Function completed (Success, Id=2d52f339-8c9c-487f-a58d-aad93349535f, Duration=28ms)

You can see the ‘distance’ and ‘speed’ data that came from the Pycom LoPy device, decoded from the payload_raw element of TTN’s JSON message.

Sending Data Back – TTN Downlink

You can use your Azure Function (or Azure in general) to send data back to your IoT devices via the TTN integration. Each TTN JSON uplink message has an element known as ‘downlink_url’; it’s this that contains the webhook URL to use to send a reply back to the device. This next piece of code builds a JSON downlink message in the correct format for TTN and sends it to the downlink_url mentioned in the initial message.

Here’s my code to send a reply (known as a downlink)…

 // Sending Reply

 // Get the downlink URL from the uplink message
 Uri ourUri = new Uri((string)reqbody.downlink_url);
 // Create a .NET web request
 WebRequest request = WebRequest.Create(ourUri);
 request.Method = "POST";

 // We're going to use a random number to set the colour of the LoPy's LED
 Random rnd = new Random();
 int r = rnd.Next(1,16777216);
 // Build the JSON that the device will interpret
 string replyJSON = @"{""colour"": " + r + "}";

 // Build the TTN JSON downlink message. Notice the Base64 conversion for the device message JSON
 string postData = @"{""dev_id"": """ + reqbody.dev_id + @""",""port"": " + reqbody.port + @", ""confirmed"": false, ""payload_raw"": """ + Convert.ToBase64String(Encoding.UTF8.GetBytes(replyJSON)) + @"""}";
 log.Info("Response: " + postData);
 byte[] byteArray = Encoding.UTF8.GetBytes(postData); 
 // Set the ContentType property of the WebRequest. 
 request.ContentType = "application/x-www-form-urlencoded"; 
 // Set the ContentLength property of the WebRequest. 
 request.ContentLength = byteArray.Length; 
 // Get the request stream. 
 Stream dataStream = request.GetRequestStream(); 
 // Write the data to the request stream. 
 dataStream.Write (byteArray, 0, byteArray.Length); 
 // Close the Stream object. 
 dataStream.Close();

 WebResponse response = request.GetResponse();

C# is not my first language so this could be prettier, but it builds a downlink message and sends it back to TTN. When the device next connects to TTN, the message will be delivered and the device can act upon it… in our case, changing the colour of its LED. It would be good form to check the WebResponse from TTN to make sure all is well, but we ignore it at the moment.
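
If you do want to check TTN’s reply, a minimal sketch (replacing the last line of the code above) might look like this…

 // Check TTN's response rather than ignoring it
 using (HttpWebResponse ttnResponse = (HttpWebResponse)request.GetResponse())
 {
     // A 2xx status means TTN accepted the downlink for delivery
     log.Info("TTN downlink response: " + (int)ttnResponse.StatusCode + " " + ttnResponse.StatusDescription);
 }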

Conclusion

We like Azure Functions because they are powerful, flexible and scalable, and it’s nice to be able to wire TTN up to them so that the whole process is seamless. We’ve not yet deployed this in a live IoT project, but we’ve no doubt that it will run successfully and give you very little trouble.

Author

Jason Timmins – Technical Director – MBM Ltd – jason@mbmltd.co.uk

Posted in: IoT

Which Level Of I.T. Support Do You Need?

Posted by: Sharon Kendal

Micro Business Maintenance (MBM) provides a range of I.T. support services, from first-line (Level 1) to more challenging (Levels 2 and 3), and works with businesses to create the right fit for their support needs.

Craig Banks, one of MBM’s I.T. support technicians, said: “It doesn’t have to be all or nothing when it comes to I.T. support. At MBM, we’re happy to work with in-house teams and complement the skills that already exist, enabling us to add value where needed.”

Many small and medium-sized businesses employ a member of staff with proficient I.T. skills who can handle common day-to-day issues. But what happens when they can’t solve an I.T. problem?

Some companies internally manage what is referred to as Level 1, or first-line, support: users unable to log on, a printer not working, loss of internet connection and so on. As an I.T. issue becomes more difficult to solve, it moves up the ladder to Level 2 and then Level 3. Each of these levels requires more advanced technical knowledge, which is often not available in-house.

Some clients only use MBM to investigate and manage the more difficult Level 2 and 3 I.T. issues. If a business doesn’t need, or can’t afford, to outsource all of its I.T., but recognises that it needs some support, splitting the service delivery can provide an affordable solution.

For more information about MBM’s Business I.T. Support services, call us on 01902 32 44 94 or click HERE.

Posted in: IT Support

I.T. Director to Swim Channel!

Posted by: Sharon Kendal

Nigel Mills, managing director of Micro-Business Maintenance Limited, is swapping his motorbike for the water as he takes part in a swimming challenge to raise money for a spinal cord injury charity.

Nigel has signed up to take part in the Aspire Channel Swim 2017, to help raise awareness of spinal cord injuries and the effect they have on people’s lives. Although he won’t actually be swimming the English Channel, he will be swimming the 22-mile distance over nine weeks in his local pool in Wombourne. For more information about the challenge, visit the website HERE.

Nigel said: “Every eight hours, someone is paralysed by a spinal cord injury. I found out about the work of the Aspire Charity and felt compelled to help. Swimming the distance of the English Channel is a big challenge, but the more I raise, the more difference it will make to the people the Aspire charity helps.”

Friends, family and business colleagues can donate HERE.

Aspire is a national charity that provides help to people who have been paralysed by a spinal cord injury. They support over 40,000 people to live fulfilled and independent lives by providing essential equipment, advice, housing and grants. For more information about the charity visit www.aspire.org.uk 

Posted in: Uncategorized

Certificate of Quality for MBM

Posted by: Sharon Kendal

Micro-Business Maintenance Limited (MBM) is delighted to have passed their recent Quality Audit. The QA Certificate, which they’ve held since 2013, confirms they operate a Quality Management System which complies with BS EN ISO 9001:2015 for the ‘provision of IT Support Services’.

Nigel Mills, managing director at MBM, says: “We’ve been providing IT support services to businesses across the West Midlands since 1990; however, a certificate provides real evidence to our customers that we operate a quality service.”

He added: “Whilst MBM and our technical staff are accredited with a number of organisations and bodies related to the IT sector, such as Microsoft, Adobe and HP, ISO is an international quality standard that businesses understand.”

For more information about MBM’s quality IT support, call 01902 32 44 94, email sales@mbmltd.co.uk or visit our website.

Posted in: Testimonial

Help With Azure On Offer

Posted by: Sharon Kendal

MBM Limited have launched a new service for local businesses who need help with Microsoft Azure.

Jason Timmins, technical director at MBM, explains: “Moving to the Azure cloud can be a cost-effective way to improve the performance, reliability and scale of your in-house and customer-facing systems. However, we recognise that some businesses don’t have the resources or skills in-house to manage the technical areas of an Azure cloud environment, and we’re therefore offering help to manage new or existing implementations.”

Microsoft Azure is a cloud environment that can be used to build, deploy and manage Microsoft and non-Microsoft applications and services through Microsoft’s worldwide network of data centres. It is ideal for businesses looking for a hybrid approach to their hosting, i.e. a mix of on-site and off-site resources, and it provides a range of technical solutions including virtual servers, web hosting, content delivery, back-end databases, IoT, AI/machine learning and more.

For information about MBM’s Microsoft Azure support service, call 01902 32 44 94, email sales@mbmltd.co.uk or visit the web page here.

Posted in: Cloud

Support for Businesses Worried About Cyber Attacks

Posted by: Sharon Kendal

According to MBM Limited, a cyber-attack is one of the main IT worries facing businesses today.

However, many companies are not aware that the UK Government has developed a scheme to help businesses implement controls to mitigate the risks from common Internet threats. The scheme, called Cyber Essentials, offers basic measures that any type or size of business can implement.

Jason Timmins, technical director at MBM, explains: “Whilst the Cyber Essentials Scheme does not address the more advanced cyber threats, it does provide a very good set of controls for basic protection from the more common Internet threats.”

The Government’s Cyber Essentials scheme covers areas such as firewalls, configuration, user access control, malware protection and patch management. MBM can help businesses identify the IT systems that may be at risk, complete the self-assessment questionnaire for the company and, if required, assist with implementing any actions needed to ensure basic cyber security protection is in place.

Jason added: “Companies who gain the formal Cyber Essentials certificate are demonstrating to their own customers and suppliers that they understand, embrace and have implemented cyber security protection measures, and are a cyber-secure business.”

For more information about MBM’s Cyber Essentials support service, call 01902 32 44 94, email sales@mbmltd.co.uk or visit https://www.mbmltd.co.uk/it-services/it-security/government-cyber-essentials-scheme

Posted in: IT Security