Cisco Live Day 4!

Today was the final and busiest day of Cisco Live 2018, with it being the last day of the World of Solutions as well as the Appreciation Event. I decided I needed to make the most of it, so I planned a packed schedule starting with my first session of the day, ‘Penetration Testing for Network Engineers – Know Yourself and the Enemy’. Now, I’m not an out-and-out network engineer, but in my role as technical lead on Cloud and Security at Concorde I felt it was important to get an insight into penetration testing and build up my knowledge of it, as well as some of the challenges that come with it.

Some of the facts in this session were a bit hard to comprehend; for example, at the current rate of growth there will be an estimated 500 billion IoT devices in the world by 2030. When I thought about that number of devices, and how it would be possible to ensure all of them are secure and protected, it did start to hurt my head a little. There was also a lot of information in this session about companies that have tried and failed to secure the devices within their networks; the number of companies that have been hit with some sort of cyber or ransomware attack in the last 12 months is very high, and growing every day.

As a company, what options do you have? Option 1 is to ‘hope that someone else fixes it’, or the wait-and-hope approach as I like to call it. Option 2, however, is to ‘validate what is going on’. By carrying out an audit, an assessment and a final pen test on your perimeter network, you can verify the safety of your environment, and if there are any gaps you are in a position to close them. Why wait until after an attack to sort it out when we can be proactive? I found this session to be full of valuable information, and it gave me more of an insight into the value of penetration testing.

As I mentioned before, I wanted to make sure I didn’t waste my final day at Cisco Live, so after a short 30-minute coffee break I was straight into my second session, ‘Firepower Platform Deep Dive’.

[Image: Firepower slide]

My only experience with Firepower so far has been using it alongside an ASA firewall, but it has since developed into its own product line, rich with features. In the session’s title you will also notice the word ‘Platform’; this is another example of a recurring theme from this week, that Cisco is no longer doing ‘solutions’ but ‘platforms’. As the name suggests, this session was very technical right from the start and gave a lot of good information about the different hardware models and the software that runs them. The one thing I didn’t like during this session was the product’s CLI, as it is different from the traditional Cisco CLI we all know and love; this one is more XML-based. The product does have a friendly user interface, but I was disappointed with the CLI being different: 98% of the Cisco product range uses the same CLI, so why change it for particular appliance ranges?

The three hardware series within the Firepower range are the 9300, 4100 and 2100, with the 9300 being the top-end model and the 2100 the entry-level model. As with most security appliances, these are all policy-based, and as mentioned earlier the session was very technical, going deep into the architecture of the devices, including the underlying software that runs them, as well as the licensing model.

[Image: Firepower hardware models]

After this session I had an hour’s break until my next session, so I decided to head down to the World of Solutions one last time. As on my last few visits I just walked around; however, one stand did take my interest, from a company called ‘ScienceLogic’ – Hybrid IT Service Assurance. Their product is an MSP-level monitoring tool, but instead of being probe-based like a lot of monitoring platforms out there, it is API-based. This means no agent needs to be installed on the local infrastructure, and it has multi-vendor support, so it can monitor a host of different device makes and models. If I’m honest, a lot of the information on the stand was marketing-based, but I did have a very insightful chat with the brilliant stand team. However, I am someone who needs to ‘see it to believe it’, so we exchanged details and my hope is to do a webinar demo of this product in the very near future!
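To give a feel for the agentless idea, here is a rough sketch of polling a device’s management API over HTTPS instead of installing a local agent. The endpoint, token and response fields are hypothetical examples, not ScienceLogic’s actual integration.

```python
# A rough sketch of agentless, API-based monitoring: poll each device's
# management API over HTTPS rather than installing a local agent.
# The endpoint, token and response fields are hypothetical examples.
import requests

DEVICES = ["https://switch01.example.net", "https://fw01.example.net"]

for base_url in DEVICES:
    resp = requests.get(
        f"{base_url}/api/v1/health",                  # hypothetical endpoint
        headers={"Authorization": "Bearer <token>"},  # hypothetical credential
        timeout=5,
    )
    resp.raise_for_status()
    print(base_url, resp.json())  # e.g. {"cpu": 12, "memory": 48, "uptime": 86400}
```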

The final session of day 4, and sadly of the week for me (I still have the closing keynote and the NOC panel discussion tomorrow), was ‘Cloud Managed Mobility with Meraki & EMM’. With this session I was more interested in the ‘Enterprise Mobility Management’ part, as I am already familiar with Meraki and its product range. The one surprise of this session was that, rather than spend two hours talking, the presenter decided to do a live demo. This was a great contrast to other sessions, where there had been some demoing but none of it live and off the cuff, and it made for some interesting viewing. As we all know, when it comes to live events something is always bound to go wrong, and this demo was no different. What made it great for me, though, was that the presenter hit a real-life issue that most of the audience could well run into if they were doing this back in the office, and it was great to see how a Meraki specialist handled overcoming it. Enterprise Mobility Management with Systems Manager looks like a great tool within the Meraki Dashboard, allowing customers to manage and protect their devices using policies and tags. With more and more customers evolving towards BYOD, tools like this are ideal for ensuring those devices are compliant with your company policies.

[Image: Meraki slide]


Day 4 was now coming to a close, and all that was left before the Appreciation evening was the closing keynote with guest speaker Bruce Dickinson, lead singer of Iron Maiden! Now, they are a bit before my time, so I wasn’t overly sure who he was, but he shared many of his life experiences, including a lot of the businesses he has run in the past – among them ownership of an airline! Unlike the opening keynote this was not really based on a Cisco message or content, but was more of an inspirational talk, which was still a fitting end to the day.

[Image: Rudimental]

The Appreciation night kicked off with the sound of very loud drums, which was followed by a great night of ‘food, drink, dance and repeat’. For more highlights of the day, don’t forget to check out my Twitter feed @shabazdarr and Concorde’s feed @ConcordeTG. The final summary blog of the week will be released early next week, so please keep your eyes peeled for that!

Shabaz Darr


Author: Shabaz Darr, Senior Professional Services Consultant at Concorde Technology Group


Cisco Live Day 3!

My day 3 started a little later than usual, as my first session wasn’t until late morning, when I attended ‘Security Monitoring with Stealthwatch: The Detailed Walkthrough’. This session was based around a product I had heard of and read much about, but had no hands-on experience with, so my hope was that this detailed walkthrough would be a good starting point to get a closer look at the product and what it can offer.

One recurring theme and question I have seen throughout the last few days is: how do we get the most out of our data? Every day we are collecting all of this data that comes into our network, but how do we make it relevant and actionable? The purpose of this seminar was to show how Stealthwatch can help make your data relevant and usable, allowing you to analyse potential threats and learn how to prevent them from entering your network.

Once the overview of the product was finished, I felt that a lot of the detailed information was lost on me; it seemed to be aimed more at people who already use Stealthwatch but want to get more from the product. I did, however, take a lot away from this session, and moving forward I would like to make time to run it in one of the hands-on labs over the next few days.

After a short coffee break, it was straight into session two: ‘Best Practices to Deploy High Availability in Wireless LAN Architecture’.


[Image: seminar]

This was a session I was really looking forward to, as I already know a lot about Cisco’s wireless range, and being able to understand how to make it highly available was a big bonus for me. The session did not disappoint, and after going through the introduction and the current product range we started taking a closer look at the considerations around HA. One quote from the presenter that hit home for me was ‘site survey, site survey, site survey’. Too often, I feel, customers don’t realise the importance of doing a site survey and see it as a waste of money; however, these surveys enable us to understand the environment and ensure we can implement a robust solution to fit its landscape.

My three main takeaways from this session are:

  • HA for wireless is a multi-level approach
  • The solution you choose depends on the amount of downtime that is acceptable for your business applications
  • SSO on the controller eliminates network downtime when a controller fails

After the back-to-back sessions, I decided to head down to the HUB for some lunch and a look at what labs were available today. I was hoping to do one on Stealthwatch; however, that was not on the list for today, and with nothing else taking my fancy I decided to head over to the DevNet Sandbox labs to go through some of my CCNA labs.

My final session of the day was ‘Security Meets SD-WAN with the Cisco Meraki MX’. I have done many Meraki installations over the last few years, so the MX was something I was looking forward to learning a bit more about. For those who are not aware of Meraki, it offers a complete cloud-managed network solution covering wireless, switching, security, SD-WAN, communications and even security cameras. My own experience is that Meraki is best known for its wireless products, and I feel Cisco has not promoted the other ranges as much as it could have done. I have, however, seen a change in recent months, and feel this year could see a big increase in sales across the Meraki portfolio.

The main benefits of a cloud-managed solution are:

  • Security
  • Reliability
  • Scalability
  • Future proofing

In my opinion, the two key elements are the last two listed, as with hardware and on-premises solutions it can sometimes be difficult and costly to scale up and future-proof due to hardware constraints. One thing I was not aware of until this session was that the Meraki cloud is backed by Cisco Talos threat intelligence, which for me is massive: not only does it strengthen the cloud security, it means the platform is always receiving threat updates and learning. In addition to the IPS and Advanced Malware Protection, Cisco Meraki is ensuring that its cloud platform is secure and has the capabilities to stay secure in the future. Another theme for me this week has been real-world examples, and this session was no different; for me, they added the bit of realism I needed to believe what I was being told. The second half of the session got more into the technical aspects of the SD-WAN deployment, which was the new part for me, along with the in-depth event and URL logs, which I was very impressed with.

That wraps up my take on day three. I must admit all the walking was really starting to take its toll, so I was tiring towards the end of the day; however, I have once again taken so much away from my seminars, as well as the labs I did. For more information on the day, take a look at my Twitter feed @shabazdarr and Concorde Technologies’ feed @ConcordeTG.

Shabaz Darr


Author: Shabaz Darr, Senior Professional Services Consultant at Concorde Technology Group

Cisco Live Day 2!

After a more relaxed start to Cisco Live on day one, I expected a much more hectic atmosphere on day 2, and that’s exactly what I got. My focus for today, and the general theme throughout the week, is to see what Cisco has to offer when it comes to both Security and Cloud. I was also looking forward to the World of Solutions opening, to see what innovations Cisco’s vendor partners are working on as well.

Before any seminars got underway, the morning was focused on keynotes: both the event keynote and the Partner Xperience keynote. As I expected, the opening keynote was based on what Cisco will be doing in the upcoming year, built on the foundation of digital transformation. My favourite part of keynote talks in general is always the customer case studies, as they put a realistic spin on the overall message, which from my point of view was how Cisco is at the forefront of digitisation and technology innovation.

[Image: keynote session]

The second keynote was the Partner Xperience, which was kicked off by Wendy Mars. This keynote was for partners only, and again I took so much away from this talk, which was one of the main sessions I wanted to attend. Cisco has run the discussions and seminars in a way that gives its customers and partners valid and usable information, rather than just a ‘marketing message’, which I have experienced at other vendor events in the past. Don’t get me wrong, there is a need for that type of message, but my experience so far is that Cisco is striking a great balance in the seminars and talks.

The whole Partner keynote was very much based on how Cisco can help its partners in overcoming customer challenges as well as partner challenges. My main takeaway from this keynote session was to ask myself: when I am putting together a solution, am I future-proofing it? Am I thinking about the ‘end-to-end’ solution, or just the specific problem at hand? Something else to come out of this talk was how Cisco is evolving its approach: rather than solutions, it is offering ‘platforms’.

Automation was another key theme throughout the talk, and ACI was touched upon for data-centre-level network automation, but the final takeaway for me was the power of data. One quote that really hit home was: ‘If you don’t collect data, what can you analyse?’ How true is that!? How do we know what threats are attacking our network if we don’t collate this information into a usable format? Cisco has a vast range of products to help partners and customers with these types of challenges.

The keynotes were now done for the day, and my first session was one I initially had mixed feelings about when I put it on my schedule: ‘Unlocking the Value of IoT Data’. The main reason being that I am very new to IoT (Internet of Things) and wanted to understand more about it. This seminar was based on two Cisco products: Cisco Kinetic and Cisco Jasper. This was my first experience with both of these products, but I must say they are very impressive. Between them, Cisco covers both IoT for OT and IT networks (Kinetic) and IoT for cellular networks (Jasper). The most eye-opening moment for me was that one of the slides, which had been updated that very morning, was already out of date. The slide in question showed how many enterprises and devices were currently using the Cisco Jasper platform; the numbers had changed so much that the slide was extremely inaccurate – in half a day! That is growth, and it just goes to show that Cisco is at the forefront of IoT.

[Image: Cisco Jasper slide]

After lunch, I headed over to the World of Solutions, which was by far the highlight of the day! I must admit that at first I was fairly overwhelmed by the sheer number of stands, people and things to see; I just had no idea where to start! I therefore decided that, with it only being day two, I should get my bearings a bit and just walk around to see what was on offer. I have put some pictures on my Twitter feed @shabazdarr with highlights of what was there.

Now I want to tell you about the most interesting part of my day, which was my visit to the World of Solutions and my discussion with the Cisco Security team about their email security products, including AMP. They were kind enough to show me a demo of the products, and one thing I saw that impressed me was that even after an email has been allowed through the email security appliance (be it cloud or on-premises), it continues to be monitored in case a threat has been missed. As soon as a threat is detected, the email is deleted! Now, this could be hours or even days after the email has been received, and I have not seen many products out there that offer this. Again, the integration between products was very impressive; the integration of Cisco AMP with Cisco Umbrella just adds to the user’s security platform.

The final talk of the day was in the C-Max theatre: ‘Changing the Security Equation’. To be honest, there wasn’t much in this talk that I didn’t already know. It went into more detail about the products Cisco has to protect networks and infrastructure from cyberthreats, including the Next-Generation Firewall (NGFW), Cisco Umbrella and Cisco Stealthwatch. It is great to hear about these products, but I need to get my hands on them to really understand what they can offer, so keep an eye out for upcoming blogs this week after I have done some more hands-on labs!

That wraps up my take on day two. A really interesting start with the seminars and talks, and it has given me a good platform for the rest of the week. For more information on today, take a look at my Twitter feed @shabazdarr and Concorde Technologies’ Twitter feed @ConcordeTG.


Shabaz Darr

Author: Shabaz Darr, Senior Professional Services Consultant at Concorde Technology Group

Cisco Live 2018 – Day 1


Today I kicked off my Cisco Live 2018 conference in Barcelona. This is my first visit to the city and my first Cisco Live event, which I am very excited about, especially after carrying out a lot of research in the build-up by looking back at previous years to learn what to expect while attending.

Within my role as Senior Professional Services Consultant at Concorde Technology Group, I act as the technical lead for Cloud and Security. With this in mind, my focus for the week ahead will be based around these Cisco technologies and I have booked in a number of sessions throughout the week, which cover both of these topics (more blogs to come in the next few days).

My focus today, however, was to explore what else the event has to offer (which is a lot), including some hands-on labs, and visit some of the different Cisco booths including meeting the engineers and the very impressive DevNet Sandbox team!

After the very easy registration process, I decided to take a trip to the ‘Hub’ area to explore what it had to offer. Being a very hands-on type of person, I was itching to do some labs, so I went across to the DevNet Sandbox team. The DevNet Sandbox offers a host of different free labs that customers and partners can take part in within a sandboxed environment. The subject matter is also very impressive and includes multiple networking, cloud and security sandboxed environments. I spent about an hour with the DevNet team going through the process of accessing the material, and I was really impressed with the content and the skill set it covers. Regardless of whether you are new to a subject or very technical and need to test something, the content covers it all.

For more information, I would recommend visiting developer.cisco.com/sandbox and having a play with the different sandboxed labs that are available.

Speaking to the DevNet Sandbox team made me even more eager for some hands-on labs, so I headed over to the ‘Walk-in hands-on labs’. Again, the subject matter covered a lot of different areas, including Cloud, Security and IoT (to mention a few). One thing on my Cisco Live bucket list was to set up and demo Cisco Umbrella. For those of you who are unaware of this product, Cisco Umbrella uses the Internet’s DNS infrastructure to block a malicious destination before a connection is ever established, delivering security from the cloud.

The lab I took part in included setting up the platform for an internal network, integrating with Active Directory, configuring and installing roaming clients (both AnyConnect and the Umbrella client) and customising different policies. Firstly, setting up the portal and adding my internal network to the service took minutes, which was a big surprise. All I needed to do was add my public IP address as a network, and update the DNS forwarders on my domain controllers to point to OpenDNS. The only part the lab didn’t cover was the requirement to open port 443 if you have a firewall, as it was already open in the lab environment.
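Out of curiosity (this wasn’t part of the lab itself), here is a quick way to check what the OpenDNS resolvers return, using the third-party dnspython package. To my knowledge, internetbadguys.com is OpenDNS’s harmless test domain, which a protected network resolves to a block page rather than the real site.

```python
# Quick sanity check that queries go via OpenDNS: ask the OpenDNS anycast
# resolvers directly. Requires 'dnspython' (pip install dnspython).
import dns.resolver

resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["208.67.222.222", "208.67.220.220"]  # OpenDNS resolvers

# internetbadguys.com is OpenDNS's harmless test domain; a protected
# network resolves it to a block page instead of the real site.
answer = resolver.resolve("internetbadguys.com", "A")
for record in answer:
    print(record.address)
```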

Once my network was working with Cisco Umbrella, I was able to configure the Active Directory integration. This allows Umbrella to listen to the domain controllers for user and computer logins via the Security event log, and enables the IP-to-user and IP-to-computer mappings in reporting, a feature a lot of customers require. The roaming clients feature was also very impressive, as it enables you to protect machines that are not on your network. You can use either Cisco AnyConnect (if you already use it) or the Umbrella roaming client (if you don’t) and install it on devices that are going to be away from your internal network. I was really impressed with the ease of deployment, which included a ready-to-download script that can be deployed via Group Policy.

The final part of the lab was configuring the policies that dictate the security protection settings, the block/allow lists, the log levels and the block splash page. All in all, I found Cisco Umbrella very user-friendly: a comprehensive tool that will help protect your network and infrastructure against malware and virus attacks from outside your network. The one drawback is that the product does not cover email scanning, although it does include some level of SMTP protection.

The final part of my day included a visit to the Cisco Live library for some last-minute cramming for my CCNA Routing & Switching exam… but you will need to wait for another blog later this week to hear about that.

If you want to keep up to date with events as they unfold here at Cisco Live, feel free to follow my Twitter feed @shabazdarr as well as Concorde’s Twitter feed @ConcordeTG.

Shabaz Darr

Author: Shabaz Darr, Senior Professional Services Consultant at Concorde Technology Group


KRACK Attack – Wi-Fi vulnerability – What does it mean to you?

[Image: Wi-Fi router]

You may have seen in the press that a vulnerability has been identified in the WPA2 Wireless encryption protocol. So what is this vulnerability and what does it mean to you?

Security researchers have discovered a number of vulnerabilities in the WPA2 (Wi-Fi Protected Access II) protocol. These vulnerabilities may allow attackers to gain access to private data traversing your wireless network.

KRACK (Key Reinstallation Attack) has been demonstrated to decrypt wireless communication on multiple platforms, including Windows, Apple iOS, Android and Linux.

So far the following protocols and ciphers are vulnerable to the attack:
• WPA
• WPA2
• WPA-TKIP cipher
• AES-CCMP
• GCMP

The flaw is not in the cryptography underlying WPA2 or its predecessor, WPA; rather, it is in the implementation. When communicating with a client device to initiate a Wi-Fi connection, the router sends a one-time cryptographic key to the device. That key is unique to that connection and that device, so that a second device on the same Wi-Fi network can’t intercept and read the traffic between the first device and the router, even though both devices are signed into the same Wi-Fi network.

The problem is that this one-time key can be transmitted more than once. To minimise connection problems, the WPA and WPA2 standards let the router retransmit the one-time key as many as three times if it does not receive an acknowledgement from the client device that the key was received. Each time the key is reinstalled, the counters used to keep the encryption fresh are reset, so the same keystream can end up being reused.

Because of that, an attacker within Wi-Fi range can capture and replay the key-carrying handshake message and, in some instances, even force the client device onto the attacker’s bogus Wi-Fi network. The attacker can then exploit the resulting keystream reuse to decrypt much of the traffic passing between the client device and the router.
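To make the ‘decrypt much of the traffic’ step concrete, here is a minimal illustration of why keystream reuse is fatal. This is a toy XOR stream cipher, not the real WPA2 CCMP cipher: XORing two ciphertexts encrypted under the same keystream cancels the keystream entirely, and a guessable plaintext then reveals the other message.

```python
# Illustrative only: shows why reusing a keystream (as forced by KRACK's
# key reinstallation, which resets the nonce counters) breaks confidentiality.
import os

def xor(data: bytes, keystream: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, keystream))

keystream = os.urandom(32)             # derived from key + nonce in a real cipher

p1 = b"user=alice password=hunter2 "   # two packets encrypted under the
p2 = b"GET /account/balance HTTP/1.1"  # SAME key + nonce => same keystream

c1, c2 = xor(p1, keystream), xor(p2, keystream)

# The attacker never learns the keystream, yet XORing the two ciphertexts
# cancels it out, leaving only the XOR of the two plaintexts:
assert xor(c1, c2) == xor(p1, p2)

# If either plaintext is guessable (headers, known fields), the other
# falls out immediately:
print(xor(xor(c1, c2), p2))  # b'user=alice password=hunter2 '
```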
So what does this mean to you?

Many vendors have already issued patches to mitigate this security vulnerability. Users are recommended to update and apply patches to their Wi-Fi-enabled equipment, including routers, user devices and smartphones.

Contact Concorde Cyber Security on 03331 300600 or email enquiries@tctg.co.uk for more information on how you can protect your business from the latest vulnerability!


Author: Carl McDade, Concorde Solutions Architect

How can Concorde Cloud Solutions help with your data protection strategy?


Backup and disaster recovery have always been considerations for customers when it comes to IT. With the evolving security landscape and the recent high-profile ransomware attacks, the ability to keep data available is becoming ever more important. Long gone are the days of daily backups and taking untested tapes home.

With all that in mind, having a strong data protection strategy is key for any business. Understanding where your data sits, how it is protected and how it can be retrieved are all important factors in building this strategy.

So, how can you protect your data?

Protecting data isn’t just backing it up; it’s making that data available. Offline backup windows are becoming non-existent, and protection is required 24/7 as businesses look to drive down their recovery point objective (RPO) and recovery time objective (RTO).

Using Veeam Backup & Replication, you have the ability to protect your critical workloads and data throughout normal operating hours. Veeam’s ecosystem of technology partners allows the protection engine to leverage vendor APIs to protect those workloads efficiently. This integration with market-leading vendors allows the protection of data to occur as often as you would like, whilst minimising the impact on production workloads.

Using this technology, you can develop your strategy to meet the RPO and RTO set by your business for each application. If you have a tier 1 application that cannot suffer more than 15 minutes’ data loss, Veeam can help you meet that. If you have a tier 3 application that cannot suffer more than 4 hours’ data loss, you can tailor Veeam to meet that too.
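As a toy illustration of that point, using only the example tiers above: each tier’s RPO dictates how often a protection job must run, because a failure can strike just before the next scheduled job.

```python
# Toy sketch: an application's RPO (maximum tolerable data loss) caps the
# interval between protection runs; tier names here are illustrative.
from datetime import timedelta

rpo_targets = {
    "tier 1 application": timedelta(minutes=15),
    "tier 3 application": timedelta(hours=4),
}

for app, rpo in rpo_targets.items():
    print(f"{app}: schedule backups/replication at least every {rpo}")
```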

Where can that data be located?

Although the data protection schedule is important, deciding where that data sits is equally important. If part of your strategy is protecting your primary data centre, it’s no good only having two copies of your data at that same site.

Veeam promotes the 3-2-1 rule to help customers adhere to best practice: 3 copies of your data, on 2 different media types, with 1 copy offsite.

[Image: Veeam 3-2-1 rule]

The 3 copies of your data include the original data set, so the rule encourages you to keep two additional backups of your original data to ensure it remains available in the event of a failure. The 2 different media types ensure that this data isn’t all stored on the same type of device; having different media sets, which can all be managed by Veeam, reduces the risk of a common failure. The 1 copy offsite protects you against the loss of your primary site.
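A minimal sketch of the rule as a checklist, with illustrative copy details, might look like this:

```python
# Toy sketch: check a set of backup copies against the 3-2-1 rule
# (3 copies, 2 media types, 1 offsite). Copy details are illustrative.
from dataclasses import dataclass

@dataclass
class Copy:
    name: str
    media: str      # e.g. "disk", "tape", "object storage"
    offsite: bool

copies = [
    Copy("production data", "disk", offsite=False),
    Copy("local backup", "tape", offsite=False),
    Copy("cloud backup", "object storage", offsite=True),
]

ok_3 = len(copies) >= 3                        # 3 copies in total
ok_2 = len({c.media for c in copies}) >= 2     # 2 different media types
ok_1 = any(c.offsite for c in copies)          # 1 copy offsite

print(f"3 copies: {ok_3}, 2 media types: {ok_2}, 1 offsite: {ok_1}")
```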

How can you get your data offsite?

Getting data offsite has always been the sticking point in historical data protection strategies. This element may have relied on individuals taking data offsite, or on expensive storage and collection fees.

Back in 2014, Veeam released the Cloud Connect feature as part of the V8 update to Veeam Backup & Replication. This feature was a catalyst for businesses to extend their data availability strategy to Veeam partners offering Backup-as-a-Service. The model was widely adopted, and three years on the Veeam Cloud Connect possibilities continue to grow with the current iteration, Veeam Backup & Replication V9.5.

Veeam Cloud Connect allows customers to connect to cloud-based repositories to store backup data offsite. The connection is made over the internet, with no requirement for site-to-site VPNs or any direct communications, and is encrypted to your service provider and secured via login credentials.

Veeam Cloud Connect also allows offsite backups to be managed under the same console as primary backups. Schedules can be created to ensure that offsite copies complete automatically, with reports provided on the success of each job.

[Image: Veeam Cloud Connect]


Concorde Cloud Solutions are offering free trials for this technology to all customers. If you’d like to learn more about how Veeam Cloud Connect can help you with your Data Protection Strategy please contact us on 03331 300600 or email enquiries@tctg.co.uk.

Author: Carl McDade, Concorde Solutions Architect


Concorde Cloud Solutions

Internet user growth is booming—3 billion people on social media alone



If you thought you couldn’t handle any more social media platforms or friend updates on your Instagram feed, you’re not alone. In a collaboration between We Are Social and Hootsuite, a new Global Digital Snapshot shows that the number of people using social media around the world has just passed 3 billion. Mashable reports that this is about 40% of the global population.

Some other interesting numbers for the August 2017 findings show that there are more than 5 billion unique mobile numbers, and 2.7 billion mobile social users. This means much of social media interaction is done on mobile.

The Next Web writes that growth trends for social media are rapidly increasing as well—growing at a rate of one million new users per day over the last quarter. The top categories of apps that users gravitate towards are communication apps, content, games, travel, and shopping. Other top app themes include food, fitness, and finance.

Social media is just one chunk of larger, overall Internet trends. Cisco contributes to understanding the latest phases of digital transformation by publishing traffic projections in its Visual Networking Index (VNI). With these predictions, we can easily see how massive and disruptive the Internet will be in the years to come.

In the most recent VNI report, Cisco found that 58% of the global population will be Internet users by 2021, and that the average Internet user will generate 61GB of Internet traffic per month by the same year. That is a huge 155% increase from 2016. Moreover, Internet video traffic, both business and consumer, will make up 80% of all Internet traffic by 2021.

Because so much of social media is done on mobile, there are also big predictions for the mobility market. Cisco forecasts that global mobile data traffic will increase sevenfold from 2016 to 2021, and that there will be 5.5 billion mobile users in 2021 (which is up from 4.9 billion in 2016).
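As a quick sanity check on those figures (purely arithmetic, using only the numbers quoted above):

```python
# Quick arithmetic check on the VNI figures quoted above (illustrative).
monthly_2021_gb = 61.0   # forecast per-user traffic for 2021
increase = 1.55          # the quoted "155% increase" from 2016
monthly_2016_gb = monthly_2021_gb / (1 + increase)
print(f"implied 2016 per-user traffic: {monthly_2016_gb:.1f} GB/month")  # ~23.9

growth = 7.0             # mobile data traffic grows sevenfold, 2016-2021
years = 5
cagr = growth ** (1 / years) - 1
print(f"implied mobile traffic growth: {cagr:.1%} per year")  # ~47.6%
```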

To learn more about Cisco VNI and to check out more Internet and mobility predictions, visit Cisco’s VNI site.

Author: Stephanie Chan, Editorial and Video Producer at Cisco

Used with the permission of http://thenetwork.cisco.com/.


Want to Be a Data Visionary? Change the Conversation


What do customers really want? What do they actually need?

If you’re like me, you’ve been trying to answer these questions every day for pretty much your entire professional career. Every conversation you have with a customer is an exercise in peeling the onion—listening to them, trying to understand their unique problems, and eventually getting to the core issues that they are looking for you to solve.

I’ll give you an example. How many times have you heard a customer ask, “Is the cloud right for me?” As IT professionals, we know that the cloud is great. It has a lot of potential, and it can be an extremely valuable tool in developing and bringing solutions to market. And because it’s the shiny new toy in the market, everyone is clamouring to find out how they can use the cloud to do things better than their competition. But as time and experience have shown, we also know that it’s not right for everyone (or everything). So how do we approach this conversation?

Here’s an idea: listen to your customers. They will tell you exactly what they need if you give them the chance. But there’s a twist: you have to ask the right questions.

The world of IT has changed. Customers don’t care about infrastructure and systems anymore. What they care about is their data. They want flexibility, choice, security, and control at a cost that works for their budgets—they couldn’t care less what the underlying storage looks like. They don’t want to hear a load of technology terms thrown at them. Because we’re now talking to CxOs, we’ve got to learn how to speak their language. These people care about business value. What are the outcomes? How is what you’re selling going to help them grow their business?

When you change the conversation to talk about data, you’ll start to see the lights come on. You don’t even need to mention NetApp (or any vendor or technology name for that matter). It’s about asking the right questions. What do you want to do with your data? How do you want to use that data to help you grow as a business? What type of data are you collecting? You’d be surprised what you can find out when you keep the conversation focused on them and their data requirements.

In my “Is the cloud right for me?” example, my customer was looking at modernizing its ERP application. Instead of going back and forth between going all-in with cloud or keeping it on-prem like countless other vendors had done, we started by asking them about their data and how they want to use it. Turns out their primary concerns were pretty standard: governance, security, performance, and quality of service. But none of the proposals that had been put forward were ideal for what the customer was trying to do. That’s because the other vendors had been trying to sell the customer on something they didn’t need, based on a conversation that didn’t focus on actual data requirements. By positioning a solution and a strategy, not just a new piece of kit, we were able to provide the customer with exactly what they were looking for, without compromising.

Of course, at the end of the day, you’ve still got to have something to sell. The solution that we positioned was ONTAP Cloud, and the strategy was Data Fabric. Without even mentioning NetApp, we were able to figure out what the customer was really looking for and how it was using data. Once we peeled back the layers of the conversation and discovered those key requirements, positioning NetApp solutions was simple and natural, because we weren’t trying to put a square peg in a round hole.

NetApp gives me the scope to widen that conversation. Whether you’re a reseller or a partner, NetApp enables you to act like a service provider, and to help your customers do the same thing. With NetApp, you’re not just selling disparate pieces of gear: you’re selling an ecosystem, a portfolio, and a strategy that your customer can build on for the future.

By talking about data, you can up-level the conversation from just another “me too” technology bidding war. Put yourself in the customer’s shoes. It may sound like common sense (because it is), but I’m always surprised at how often people forget. NetApp gives you the tools to be a data visionary for your customers. But just because you have the world’s best hammer, it doesn’t mean every customer is a nail. Take the time to listen. Ask the right questions. Be the partner they need you to be. And when you’re finally ready to talk tech, NetApp is here to help.

Author: Mark Carlton, Group Technical Services Manager

Is Your Data Protected In The Cloud?

Every day, businesses are changing the way they deliver applications to their users. Applications traditionally delivered on-premises are increasingly consumed as a service from Software-as-a-Service (SaaS) providers such as Microsoft.

It is estimated that the use of SaaS software will grow at roughly five times the rate of on-premises solutions. Now, you’re probably thinking these are just marketing figures, but in my experience over the last year, the conversations I am having with businesses show that there is a real interest in and push towards SaaS, for both functionality and commercial reasons.

The most prominent of these SaaS solutions is Office 365. Businesses are deploying O365 to provide email and collaboration services to their users.

However, there is one worrying aspect of SaaS deployments that I have noticed, and that is the risk that comes with putting your data in the cloud.

We know public cloud providers have robust disaster recovery capabilities, with multiple data centres and replication, but native backup is something some providers lack, and it is often assumed that it is included as part of the SaaS service you are paying for. It can come as a shock to find out that in some cases it isn’t, and the question that always follows is: “how can I back up my data?”

There are a number of tools on the market that can back up your email from Office 365, but what about your other applications, and what happens if you don’t have anywhere to back them up to?

That’s where NetApp can help, with their new Cloud Control software.

Cloud Control provides businesses with the ability to back up and protect their cloud-based data. It gives them the tools to take the data they have within O365 and back it up to a secondary location.



One of the key things for me is that this provides a business with flexibility and choice. Cloud Control provides you with multiple deployment scenarios today.

  • Back up your Office 365 data to storage created and managed by Cloud Control as part of your solution; this is AWS S3 under the hood, providing cloud-to-cloud backup.
  • Bring your own licence and back up your Office 365 data to your own AWS S3 storage, or use a StorageGRID Webscale solution as the backup target, providing cloud-to-cloud backup to either public or private storage (a rough sketch of this bring-your-own-bucket idea follows below).
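For the bring-your-own-bucket scenario, landing a backup object in your own S3 target looks roughly like the following. This is generic boto3 usage under assumed names, not Cloud Control’s actual integration.

```python
# Rough sketch of the bring-your-own-bucket idea: push a backup object
# to your own AWS S3 target with generic boto3 calls. The bucket, key
# and file names are hypothetical; this is not Cloud Control's API.
import boto3

s3 = boto3.client("s3")  # credentials come from the environment or an IAM role

s3.upload_file(
    Filename="o365-mailbox-export.zip",            # hypothetical local export
    Bucket="my-o365-backup-bucket",                # hypothetical bucket
    Key="backups/2018-02-01/mailbox-export.zip",
    ExtraArgs={"ServerSideEncryption": "AES256"},  # encrypt the object at rest
)
```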



Cloud Control is a full SaaS application: there is no need for agents to be deployed, no software to install and no infrastructure required, making it easy to deploy and manage.

But Cloud Control doesn’t just help you protect your data through backups, it also provides multiple layers of operational and physical security.

Strong encryption:
Cloud Control protects data at rest with 256-bit AES object-level encryption using a unique encryption key (illustrated below). All data in transit is also protected with Secure Sockets Layer (SSL) encryption.

Intrusion detection:
The Cloud Control environment constantly guards against intrusion with real-time monitoring, detection and alerting.

Controlled access:
Access to the production environment is granted only to a dedicated operations team with specific operational requirements. Changes to the production environment are tracked and audited.
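To make the strong-encryption point above concrete, here is a minimal sketch of object-level AES-256 encryption with a unique key per object, using the third-party ‘cryptography’ package’s AES-GCM primitive. This is generic cryptography, not Cloud Control’s own implementation.

```python
# Illustrative only: object-level 256-bit AES encryption with a unique
# key per object (generic AES-GCM, not Cloud Control's actual code).
# Requires the third-party 'cryptography' package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_object(plaintext: bytes):
    key = AESGCM.generate_key(bit_length=256)  # unique 256-bit key per object
    nonce = os.urandom(12)                     # 96-bit nonce, never reused per key
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return key, nonce, ciphertext

key, nonce, ct = encrypt_object(b"mailbox item contents")
assert AESGCM(key).decrypt(nonce, ct, None) == b"mailbox item contents"
```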

In summary, to me Cloud Control provides a flexible, secure, efficient and cost-effective solution for protecting your SaaS applications.

If you’d like to learn more about NetApp Cloud control or see it in action please call one of our experts on 03331 300600 or email groupsales@tctg.co.uk.


Author: Mark Carlton, Group Technical Services Manager

Wait, this email isn’t for me – what’s it doing in my inbox?

AdobeStock_114755712.jpeg

For as long as email has been in the mainstream, stories have abounded about messages reaching the wrong recipient, with embarrassing or detrimental consequences. Perhaps a mis-sent shipping notification from a retailer isn’t a big deal, but a financial email containing sensitive information definitely shouldn’t land in the wrong inbox.

Recently this topic came up on Ask Slashdot via user periklisv, with the pointed question: What do you do when you get a misdirected email?

Over the past six months, some dude in Australia (I live in the EU) who happens to have the same last name as myself is using [my email address] to sign up to all sorts of services… how do you cope with such a case, especially nowadays that sites seem to ignore the email verification for signups?

The thread is full of anecdata about emails sent to the wrong recipients, often full of embarrassing or sensitive information: bank statements, loan information, lawyer correspondence.

A quick search reveals that this issue comes up in the news on a larger scale with some frequency. For example, in 2012, a company accidentally emailed an employee termination notice to all of its 1,300 global employees instead of just one. Thankfully, people quickly caught on that this email wasn’t meant to go on blast (unfortunately for the person who was still fired).

These mistakes are rather innocuous; usually made by someone omitting a character, making a typo, or mixing up domain names or extensions (.com instead of .net, Yahoo instead of Gmail) in a rushed moment, they are usually resolved by a quick “hey, you sent this to the wrong person” reply.
But what happens if a misdirected personal email lands in the inbox of someone who might not be so honest? Or what happens when a large company sends out confidential information via email to unintended recipients?

Just one example: a representative from Rocky Mountain Bank sent sensitive customer loan information to the wrong recipient via email and sued Google to try to quash the breach and keep the data from spreading any further. (Luckily for the employee, it turned out that the unintended recipient marked the email as spam and never even looked at it.)

That’s a data breach thanks to a simple typo. In theory, this should be easy enough to avoid.
But this isn’t a new problem. In fact, in 2011, several security researchers highlighted exactly how an enterprising criminal could typosquat on a number of domain names and wait for confidential information to arrive in misdirected emails, like a trapdoor spider waiting for its prey. The researchers captured more than 20GB of data from 120,000 misdirected emails meant for Fortune 500 companies in the span of six months.

The difference between the legitimate email addresses and the ones used by the security researchers? A simple dot. That’s all.
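To see how little separates a legitimate address from a doppelganger, here is a minimal sketch of generating the ‘missing dot’ variants of a subdomained mail domain; the domain used is a made-up example.

```python
# A minimal sketch of the "doppelganger domain" idea described above:
# generate the missing-dot variants of a subdomained mail domain that a
# typosquatter could register. The domain is a made-up example.
def missing_dot_variants(domain: str) -> list[str]:
    labels = domain.split(".")
    variants = []
    # Fuse each adjacent pair of labels, dropping one internal dot at a
    # time (the dot before the TLD is skipped, since fusing it would not
    # yield a registrable domain).
    for i in range(len(labels) - 2):
        fused = labels[:i] + [labels[i] + labels[i + 1]] + labels[i + 2:]
        variants.append(".".join(fused))
    return variants

print(missing_dot_variants("us.mail.example.com"))
# ['usmail.example.com', 'us.mailexample.com']
```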
As with so many security issues that are ultimately rooted in habit and human error, mitigating this one is easier said than done. Many people know they shouldn’t send sensitive information via email, but inevitably some do it anyway out of (what they see as) necessity.

Of course, robust data and email policies to filter and/or block confidential information from egressing via email can certainly help. There are additional technical approaches we would also recommend:

Email verification for signup forms: People are in a hurry and make mistakes; it’s always going to happen. As identified by the Slashdot poster, the simple step of adding an email verification step to a sign-up process would do much to reduce misdirected emails (sketched in the code after these recommendations).

Make it easier for employees to stop hitting the “attach” button: We follow the path of least resistance; if it’s too difficult to collaborate or share by any other method, people will stick with what they know and what’s fastest. Centralized file repositories, internally or in the cloud (like Dropbox), when implemented well, can make using email attachments less appealing by comparison.

Encrypt: Another possible failsafe is to encrypt everything that’s outgoing; that way, even if an email does end up in the wrong hands, there’s not much the recipient can do with it.
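A minimal sketch of what the email-verification suggestion above can look like server-side, with assumed names (the send_email helper and storage are stand-ins for real infrastructure):

```python
# A minimal sketch of email verification for signups: issue a random
# token, mail it as a link, and only trust the address once the link is
# visited. The send_email helper and in-memory store are stand-ins.
import secrets

pending = {}  # token -> email address awaiting confirmation

def start_signup(email: str) -> str:
    token = secrets.token_urlsafe(32)
    pending[token] = email
    # send_email(email, f"https://example.com/verify?token={token}")  # hypothetical
    return token

def verify(token: str):
    # Returns the confirmed address, or None for an unknown token, so
    # only an address whose owner clicked the link is ever activated.
    return pending.pop(token, None)
```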
Are misdirected emails an issue where you work? Have you managed to make them a thing of the past? We welcome your thoughts or tips on how to mitigate this issue in the comments.


Author: Maria Varmazis, Naked Security Author (Sophos)