Data Storage Solutions, Tue, 29 Jun 2021

Fujitsu RX2540 M6 Server Tue, 29 Jun 2021 11:59:39 +0000 Servers for a Data Driven World

The new Fujitsu RX2540 M6 is designed to be a high-performance, small-footprint server. It supports dual sockets, providing the ideal balance of density and scalability using the new EDSFF drives. Processing power comes from the new 3rd Generation Intel® Xeon® Scalable Processors with up to 40 cores per CPU, all housed in a dense 2U form factor.

Fujitsu RX2540 M6

Fujitsu RX2540 M6 Specification Overview

The Fujitsu RX2540 M6 supports up to 64 EDSFF drives of 4TB, providing 256TB of raw storage space. Alternatively, you can choose 24x 2.5″ SAS/SATA/NVMe drives of up to 30.72TB each, or 12x 3.5″ SAS/SATA drives of up to 144TB total capacity.

Choose one or two processors from the Intel® Xeon® Silver 43xx, Gold 53xx, Gold 63xx or Platinum 83xx families.

Configure up to 8TB of memory using up to 32 DIMM slots (16 DIMMs per CPU; 8 channels with 2 slots per channel) of DDR4 3200MHz memory, or up to 12TB using Intel® Optane™ persistent memory.
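
The memory maths can be sanity-checked quickly. This sketch assumes 256GB DDR4 DIMMs, the largest commonly available at launch (an assumption, not a quoted spec):

```python
# Sanity-check the maximum memory configuration: DIMM slots x DIMM size.
def max_memory_tb(dimms_per_cpu: int, cpus: int, dimm_gb: int) -> float:
    """Total memory in TB for a fully populated configuration."""
    return dimms_per_cpu * cpus * dimm_gb / 1024

# 16 DIMMs per CPU x 2 CPUs x 256GB = 8TB, matching the quoted maximum.
print(max_memory_tb(16, 2, 256))  # 8.0
```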

Visit our web page for more details:

Identifying Unstructured Data Tue, 29 Jun 2021 08:13:26 +0000

Unstructured data is the biggest headache today for any organisation trying to control and manage data. It consumes over 70% of all information stored and is growing at 61% per annum!

Reduce backup times by 80% by backing up only hot data!

Unstructured Data

Firstly, let us understand what we are dealing with. Unstructured data is information that is typically not stored in a database.

Unstructured Data manifests itself in two ways:

  1. “TEXT” can be e-mail, texts, word documents, presentations, messaging systems, Twitter, Facebook etc.
  2. “RICH MEDIA” can be images, sound files, movie files etc.

As we have explained, unstructured data consumes vast amounts of storage, but another consideration is legislation. Where this data resides is important if you need to retrieve the information for a compliance audit or lawsuit.


In short, unstructured data:

  • can be of any type
  • does not necessarily follow any format or sequence
  • does not follow any rules
  • is not predictable

Discovery of Unstructured Data

How organisations identify this data is vitally important in finding out whether it has intrinsic value to the business or is the next lawsuit waiting to happen. Unstructured data resides in many places: desktops, laptops, servers, NAS, SAN and cloud, and it is growing fast, very fast!

By 2025, IDC estimates we will be creating 463 EB (exabytes) of data daily, or around 168 ZB annually; a four- to five-fold increase over 2020 estimates.
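
The daily and annual figures quoted are consistent, as a quick back-of-the-envelope check confirms:

```python
# Convert the IDC daily estimate to an annual figure.
# 1 ZB = 1000 EB; a 365-day year is assumed.
daily_eb = 463
annual_zb = daily_eb * 365 / 1000
print(round(annual_zb))  # 169, in line with the quoted 168 ZB
```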

Firstly, we need to identify the types of unstructured data and where they currently reside. From this we can answer questions such as:

  1. How much unstructured data do we have?
  2. How many copies of the same file do we have?
  3. On which systems and data storage platforms does the information reside?
  4. When was it created?
  5. When was it last accessed?
  6. What size is the file data?
  7. Who owns the files?
  8. When was it last modified?
  9. Is the data relevant to the business?
  10. Do the files need to be archived?
  11. Should the data be restricted?
  12. Who is generating this data?
  13. Is the data ours?
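
Several of the questions above (sizes, timestamps, ownership, duplicate copies) can be answered by simply walking a file share. A minimal Python sketch follows; it hashes every file to find duplicates, which is affordable on a sample but a real audit tool would hash lazily on very large shares:

```python
import hashlib
import os
from collections import defaultdict

def scan(root):
    """Walk a directory tree, returning per-file metadata and duplicate groups."""
    files = []
    by_hash = defaultdict(list)
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            with open(path, "rb") as fh:
                digest = hashlib.sha256(fh.read()).hexdigest()
            files.append({
                "path": path,
                "size": st.st_size,        # what size is the file?
                "accessed": st.st_atime,   # when was it last accessed?
                "modified": st.st_mtime,   # when was it last modified?
                "owner_uid": st.st_uid,    # who owns the file?
            })
            by_hash[digest].append(path)
    # Byte-identical files answer "how many copies do we have?"
    duplicates = {h: paths for h, paths in by_hash.items() if len(paths) > 1}
    return files, duplicates
```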

Existing IT Investment

Companies spend a huge amount of money on storage and servers, and the investment grows year on year. Recent reports indicate that by 2025 we will be purchasing two to three times as much storage capacity as we are today; whether that is cloud or on-premises storage, the data management issues aren't going away.

Implementing a tiered data archive for unstructured data, and moving that data through the different storage tiers, frees up valuable disk space on the most expensive, highest-performing storage.

By moving this data we can slow the necessary, ongoing investment in tier 1 storage, giving a huge ROI benefit.  An additional benefit of active archiving is that you may be able to utilise your existing older storage systems to archive data.

Where unstructured data is stored is an important consideration.  Managing it will consume increasing amounts of the IT budget and available resources due to the explosion in data growth.

Data Archiving Benefits

  1. Cost savings
  2. Energy savings
  3. ROI savings
  4. Decreased backup times
  5. Free up valuable tier 1 disk space
  6. Non-disruptive to users
  7. Enables identification of data for business governance
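
The archiving policy behind these benefits can be sketched in a few lines: sweep tier 1 storage and move files that have not been accessed for a given period to a cheaper archive tier. The paths and the 180-day threshold are illustrative assumptions, and a production tool would leave a stub or link behind so the move is non-disruptive to users:

```python
import os
import shutil
import time

def archive_cold_files(tier1_root, archive_root, max_idle_days=180):
    """Move files idle for longer than max_idle_days to the archive tier."""
    cutoff = time.time() - max_idle_days * 86400
    moved = []
    for dirpath, _, names in os.walk(tier1_root):
        for name in names:
            src = os.path.join(dirpath, name)
            if os.stat(src).st_atime < cutoff:  # not accessed recently
                dst = os.path.join(archive_root,
                                   os.path.relpath(src, tier1_root))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)
                moved.append(dst)
    return moved
```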

Download our Infographic on Unstructured Data


For a free no obligation assessment of your unstructured data please call or email using the details below:

Call us on 01256 331614 or email:

Thanks for reading

Fujitsu RX2530 M6 Server Wed, 23 Jun 2021 11:56:20 +0000 Servers for a Data Driven World

The new Fujitsu RX2530 M6 is designed to be a high-performance, small-footprint server. It supports dual sockets, providing the ideal balance of density and scalability using the new EDSFF drives. Processing power comes from the new 3rd Generation Intel® Xeon® Scalable Processors with up to 40 cores per CPU, all housed in a dense 1U form factor.

Fujitsu RX2530 M6

Fujitsu RX2530 M6 Specification Overview

The Fujitsu RX2530 M6 supports up to 32 EDSFF drives of 4TB, providing 128TB of raw storage space. Alternatively, you can choose 10x 2.5″ SAS/SATA/NVMe drives of up to 15.36TB each, or 4x 3.5″ SAS/SATA drives of up to 12TB capacity.

Choose one or two processors from the Intel® Xeon® Silver 43xx, Gold 53xx, Gold 63xx or Platinum 83xx families.

Configure up to 4TB of memory using up to 32 DIMM slots (16 DIMMs per CPU; 8 channels with 2 slots per channel) of DDR4 3200MHz memory, or up to 10TB using Intel® Optane™ persistent memory.

Visit our web page for more details:

Quantum LTO Tape Thu, 08 Apr 2021 14:19:13 +0000 The Quantum LTO Tape MR-L8MQN-01 stores 12TB native or 30TB compressed. We also provide LTO-7 to LTO-5 media, WORM, cleaning tapes and barcode labels. Quantum joined the LTO Ultrium consortium, which was founded by HP, IBM and Seagate; at the time Quantum had its own format, DLT (Digital Linear Tape).

Media Compliance

Quantum’s Ultrium media is compliant with all media integrity analysis utilities. You can add another level of protection throughout the lifecycle of your data with the Advanced Media Usage Report and Extended Data Life Management (EDLM) features available in Ultrium media. Media analysis and usage reports provide a view of the media condition for an entire media pool used within a specific tape library, even those in off-site storage. EDLM is a unique Scalar® automation feature designed to ensure the media integrity of cartridges placed in long-term archival or vaulted storage that are no longer used during normal operations. When your critical data is involved, it’s important to have a complete health record of your media.

Quantum LTO Tape

We provide media in WORM format from LTO-4 to LTO-8 to meet today's stringent regulatory and compliance requirements. WORM cartridges physically prevent data from being overwritten or tampered with once it is written to the device, providing a reliable, accurate and scalable data integrity solution.

Additional Benefits

Quantum continuously performs additional process and quality tests on its Ultrium data media, demonstrating a continuing commitment to providing the highest quality media. Certification focuses on key user metrics: capacity, transfer rate, servo performance and green media characteristics, with specific emphasis on reliability of written data.

Have you ever spent hours unpacking, labelling, and loading media into your library? Why waste IT resources unpacking boxes and labelling cartridges? Your time and your data are valuable. We understand that.

  • Allow us to pre-label your data cartridges with your requested bar code label sequences and colour scheme. Cartridges will arrive ready to load into the library.
  • No need for individual cartridge cases? No problem! We offer “library packaging” to save you time.
  • This eco-friendly bulk packaging option protects cartridges during shipment and storage without individual cartridge cases. This allows quicker access to the cartridges to expedite loading into automation slots/trays and it results in less waste.

Combine bar code labelling with library packaging and save even more time!
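
An LTO barcode label is typically a customer prefix, a zero-padded sequence number and a media-type suffix (e.g. "L8" for LTO-8). A small, purely illustrative generator (the "FOR" prefix and range are made-up examples, not a real labelling scheme):

```python
# Generate a sequential run of LTO barcode labels.
def label_sequence(prefix, start, count, media="L8"):
    """Yield labels such as FOR001L8, FOR002L8, ..."""
    return [f"{prefix}{n:03d}{media}" for n in range(start, start + count)]

print(label_sequence("FOR", 1, 3))  # ['FOR001L8', 'FOR002L8', 'FOR003L8']
```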

Type M formatted Media

The Ultrium 8 drives include a new format feature enabling customers to increase the capacity of new or unused Ultrium 7 cartridges by up to 50 percent to store 9 TB (uncompressed) or 22.5 TB (1) (compressed) of data. To unlock this additional capacity, customers need drives with a utility or application to up-format new or unused Ultrium 7 cartridges with properly attached “M8” media ID bar code labels.
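
The Type M arithmetic, made explicit: an LTO-7 cartridge holds 6TB native, M8 formatting raises that by 50% to 9TB, and the quoted 22.5TB assumes the 2.5:1 compression ratio from footnote 1:

```python
# Type M (M8) capacity check.
lto7_native_tb = 6
m8_native_tb = lto7_native_tb * 1.5     # +50% from M8 formatting
m8_compressed_tb = m8_native_tb * 2.5   # at 2.5:1 compression
print(m8_native_tb, m8_compressed_tb)   # 9.0 22.5
```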

Quantum LTO Tape Datasheet

If you want to purchase Quantum Media please click here.

1 Based on 2.5:1 compression
2 Based on 2:1 compression

Lenovo ThinkStation P620 – Blazingly Fast Thu, 03 Sep 2020 13:31:56 +0000 Lenovo have designed and built the world's fastest single-processor workstation, running the AMD Ryzen Threadripper Pro. Available now, the new Lenovo ThinkStation P620 supports two NVIDIA Quadro RTX 8000 or four NVIDIA Quadro RTX 4000 GPU cards.

Lenovo ThinkStation P620 features

Designed for professionals across multiple industries including:

  • Media & Entertainment
  • Architecture, Engineering & Construction
  • Manufacturing & Product Development
  • Healthcare
  • Financial Services
  • Science
  • Artificial Intelligence (AI)

The Lenovo ThinkStation P620 is certified for a full list of ISVs including Adobe, Avid, Autodesk, Dassault and Siemens, to name a few. It supports up to 1TB of DDR4 3200MHz memory, up to 64 cores, and 128 lanes of PCIe 4.0 bandwidth.

Editing information locally is also supported: the Lenovo ThinkStation P620 provides up to 20TB of internal storage, allied to a 10GbE port to connect to your existing infrastructure.

With a choice of AMD Threadripper™ PRO processors from 12 to 64 cores, the Lenovo ThinkStation P620 can support any application and complete projects faster than any competing processor.

The Lenovo P620 compares very favourably against the competition in performance benchmarks: the 12-core Threadripper Pro 3945WX runs Cinebench 12% higher in single-threaded mode and 28% higher in multi-threaded mode.

If you would like to know more, please follow the link below:

Remote Server / Edge Server Mon, 21 Oct 2019 12:23:02 +0000 What is it?

An edge server, or remote server, is typically a server that allows for remote connectivity; it can also be a server that needs to operate in a remote location.

The edge server does both of the above.

Edge Server

An edge server is designed to sit on the edge of a network, either LAN or WAN, providing local or remote connectivity to applications and content for client machines.

An example of where an edge server is needed would be a field office where the only connectivity is 4G LTE (with 5G coming), allowing connectivity to the internet or main office while providing compute, network and storage locally.

Houston, we have a problem!

Cooling a server normally requires an air conditioning unit: servers generate heat, often in excess of 2,200 BTU/hr, which is fine in the winter but not so good in the summer.
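
To put the BTU figures in context: one watt of power draw dissipates roughly 3.412 BTU/hr of heat, so converting between the two is straightforward:

```python
# Convert electrical power draw to heat output (1 W ~ 3.412 BTU/hr).
def watts_to_btu_hr(watts: float) -> float:
    return watts * 3.412

# A server emitting 2,200 BTU/hr is drawing roughly 645W continuously.
print(round(2200 / 3.412))  # ~645 W
```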

Therefore, data centre servers aren't the right solution, whereas a small, compact, powerful and secure remote server would fit the bill perfectly.

Luckily, there's a company out there who addresses the above problems and is setting a hot trend, and we can see why…

My bad, we’re good.

Lenovo have produced the Lenovo SE350 Edge Server, a fantastic remote server that supports an Intel Xeon D processor with up to 16 cores.

It is ultra-small (h 1.7″ x w 8.2″ x d 14.8″). You can stack units horizontally or vertically, mount them on a wall or ceiling, or mount them in a computer rack. They support dual power inputs of 240W each, or 48V DC, and generate 783 BTU/hr using dual power supplies.

Connect using 10GbE, 1GbE or 100Mb Ethernet, wirelessly, or over 4G LTE.  It also runs full versions of Microsoft Windows Server 2016/2019, Red Hat Enterprise Linux, VMware ESXi, Ubuntu Server and SUSE Linux Enterprise, with support for up to 10 M.2 drives and hardware RAID if required.

The Lenovo SE350 Edge Server, the most versatile remote server, also has a range of physical and electronic security measures to protect the data at the remote office, including a Kensington lock and a locking front bezel.  It is also rugged, tamper-resistant and encrypted for true mobility.

The applications or use cases are numerous, video surveillance such as facial recognition, ROBO, factory floors, transport and field offices are just a few ideas.


If you work for a business that has multiple remote offices or stores, is having difficulty with connectivity and wants to deploy a server remotely, then we have the solution for you.

At Fortuna Data we are a Lenovo partner, talk to us now about how this next generation of technology is really making a big impact in the way businesses work.

If you want to find out how we can help, call us on 01256 331614 or email

Smart Strategic Thinking

Visit our web page for more details

Composable infrastructure – the new kid on the block! Thu, 17 Oct 2019 15:07:12 +0000 Information technology agility today is key to running the next generation of applications and programs. 

For too long, businesses have relied on the traditional 3-tier architecture approach of servers, storage and networking.  The next step was converged infrastructure, where certified and approved hardware was connected using a software layer to manage various aspects of the infrastructure.

The next phase is HCI (hyper-converged infrastructure). This uses a software-defined layer to manage the whole hardware and software stack, including switch management, VM creation and deployment, data protection and security.

HCI is a far more complete solution than CI (converged infrastructure), as it scales better, is more flexible and can utilise standard x86 servers.

Composable Infrastructure

She’s a beauty

The next step in the evolution of I.T. is composable infrastructure. Unlike HCI, where you create nodes and pool them together to provide compute, network and storage, composable treats each of these components as a pooled resource.

Composable infrastructures provide the ability to automatically create or remove these pools within minutes.  No longer do you need to provision nodes to perform specific tasks.

Moving to a composable infrastructure conveniently turns your data centre into an elastic bare-metal cloud and allows you to instantly create the necessary resources your applications need. 

Build instantaneous workloads for your applications, and when you're finished the resources are returned to the pool, ready for the next application.  By design, composable infrastructures are highly scalable, allowing you to manage thousands of petabytes from a single console.
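
The compose-and-release idea can be illustrated with a toy model: compute and storage live in shared pools, a workload carves out what it needs, and everything goes back when it finishes. Real composable platforms do this in hardware and firmware; this sketch is purely conceptual:

```python
class ResourcePool:
    """Toy model of a composable pool of compute and storage."""

    def __init__(self, cores, storage_tb):
        self.free = {"cores": cores, "storage_tb": storage_tb}

    def compose(self, cores, storage_tb):
        """Carve a workload out of the free pool, or refuse if exhausted."""
        if cores > self.free["cores"] or storage_tb > self.free["storage_tb"]:
            raise RuntimeError("pool exhausted")
        self.free["cores"] -= cores
        self.free["storage_tb"] -= storage_tb
        return {"cores": cores, "storage_tb": storage_tb}

    def release(self, workload):
        """Return a finished workload's resources to the pool."""
        self.free["cores"] += workload["cores"]
        self.free["storage_tb"] += workload["storage_tb"]

pool = ResourcePool(cores=128, storage_tb=1000)
job = pool.compose(cores=32, storage_tb=100)  # resources appear on demand
pool.release(job)                             # ...and go straight back
```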

All good in theory, but what about practicality?

The number of applications and uses that a composable infrastructure could be used for is immense, below we list some of the more common ones.

The Cloud

Purely from a financial standpoint, the cloud is expensive and slow when uploading and downloading large data sets for analysis.  The high cost of cloud computing and storage services is a barrier to building a cost-effective infrastructure, whereas a composable infrastructure could provide the elasticity a business needs.

Big Data and Data Mining – Employs Hadoop, NoSQL, massively parallel databases and other applications for advanced analytics against very large data sets of structured, unstructured and semi-structured data residing in data lakes and data warehouses.  This is where composable infrastructure wins, as it allows you to scale as the data becomes available.

Machine Learning – This is evolving at a phenomenal rate! It seems everyone and his dog is having a crack. The technology and infrastructure required to deploy the necessary component parts are expensive and complex.  Compute- and data-intensive deep neural networks require much larger data sets and high-performance technologies such as GPUs and low-latency network fabric. Production deployments, also called inference, are less computationally intensive, with lower requirements for compute and storage, yet may still require low latency.  Having the complete flexibility to manage resources on the fly provides an ideal solution for machine learning applications.

Kubernetes Containers – Kubernetes groups the containers that make up an application into logical units for easy management and discovery.  Designed on the same principles that allow Google to run billions of containers a week, Kubernetes can scale without increasing your ops team.

Kubernetes provides the tools needed to build and deploy reliable, scalable, distributed applications. And with the addition of storage volumes, Kubernetes solves the need for storage availability by requesting and attaching volumes to containers making the deployment of stateful applications, such as databases, in containers a popular solution.

With Kubernetes, IT can deliver advanced analytics, AI/machine learning and NoSQL databases using lightweight, flexible containers with the ability to deploy, scale and manage them seamlessly.  This was one of the first applications that composable storage was designed to use and offers instantaneous deployment of pooled resources.
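
The volume-request mechanism described above can be sketched as the Python-dict equivalent of a Kubernetes PersistentVolumeClaim manifest; the claim name and 100Gi size here are illustrative assumptions, not a product configuration:

```python
# Dict mirror of the YAML a stateful application (e.g. a database pod)
# would submit to claim persistent storage from the cluster.
pvc = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "db-data"},       # hypothetical claim name
    "spec": {
        "accessModes": ["ReadWriteOnce"],  # one node mounts it read-write
        "resources": {"requests": {"storage": "100Gi"}},
    },
}
print(pvc["spec"]["resources"]["requests"]["storage"])  # 100Gi
```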

Tiered Data Storage – Tiered storage assigns different categories of applications and their data to a range of storage media types. Tiering for online data ranges from tier 0 for hot data, such as transactional databases and fraud detection, to tier 3 for cold, infrequently accessed data. Tier 4 is generally for archived or regulated data stored on tape or at a cloud service. No matter the applications or how you tier your data, finding a server and storage infrastructure that adapts to your requirements is paramount.  Composable infrastructures provide the ability to automatically migrate data from HDD – SSD – NVMe or back again as and when required.

Cold Data and Object Storage – Enterprises and cloud companies are employing big data analytics and AI to gain operational insights, make better business predictions and personalise content for customers. The continuous growth of data for these uses, and many more require companies to build larger stores, keep data online longer and bring archived data back online. Much of this data is infrequently accessed, or cold data, and requires economical and efficient storage at scale.

Accurately planning how much compute and storage is needed for cold data or a growing object store, and efficiently provisioning compute and drives is an ongoing challenge.

As a result, there is growing demand for infrastructure solutions that can be flexibly designed and that scale seamlessly and cost-effectively while ensuring quick and easy access to cold data whenever it is needed.  As composable infrastructures can grow from terabytes to petabytes utilising differing storage technologies, they are an ideal choice.

NVMe-oF Infrastructure – Data-driven companies are increasingly using NVMe storage for performance-hungry workloads that require low latency and high I/O throughput.

 These applications include analytics, predictive modelling, streaming, machine learning and large-volume transaction processing.  There are many advantages to flash storage, most notably high IOPS in a small footprint – a single 2RU all-flash array can replace one or two racks of HDDs when rightsizing the capacity to the required performance.  With the ability to carve flash drives into slices that can be attached to individual compute nodes, flash utilisation can be increased dramatically. In addition, reduced energy requirements and higher reliability make flash an ideal data centre solution.

NVMe over Fabrics (NVMe-oF), a new high-performance networking solution, eliminates the performance issues that required you to isolate flash inside server nodes.

The disaggregation of compute from storage creates the opportunity for composable infrastructures to orchestrate or recreate the server and storage platform in a completely new way.

Future Possibilities

Today, most data-intensive computing workloads are run on direct-attached storage to provide performance and capacity.  Whilst this is ideal for handling large data volumes, it is complex to install, manage and deploy, and is normally over-provisioned or underutilised.

The complexity of adding additional DAS (direct-attached storage) to the environment causes issues, even more so if adding DAS from a different vendor.  The flexibility of a composable infrastructure massively simplifies this by allowing you to simply add additional resources as and when required.

Data is being created daily at unparalleled levels; it needs to be examined, analysed, mined, acted upon and ultimately stored. This is where composable allows for simple organic growth as and when required, all controlled from a single GUI.   Composable infrastructures provide the agility to manage any modern infrastructure efficiently, while being secured using data encryption and reducing TCO through commodity elements that scale.


If you work for a business that is producing huge amounts of data, or own huge amounts of legacy data, deploying a composable infrastructure would be the best move you can make.   

At Fortuna we can also offer your company data mining as a service, for either a flat fee or a subscription.

At Fortuna Data we specialise in CI, HCI and composable infrastructures. Talk to us now about how this next generation of technology is really making a big impact on the way businesses work.

If you want to find out how we can help, call us on 01256 331614 or email

Visit our web page for more information

SAN and NAS Fri, 11 Oct 2019 09:05:55 +0000 Straight facts

Any business today needs to store data. Whether that data resides in the cloud or on-premises, a business would normally have a SAN, a NAS or both.

If the data is on-premises it will reside on NAS or SAN storage, or both.  With the projected growth in data creation over the next 5 years, there is now a huge variety of SAN and NAS storage solutions to choose from.

When choosing your storage, be sure to consider a solution that supports NVMe flash storage as this should help future proof your infrastructure for the next 5 years.

With 18TB drives now shipping, storing large amounts of unstructured data on a NAS is relatively simple.



With both SAN and NAS storage, things get really interesting when we look at how the data is protected.

A traditional method for storing data would be to use RAID, but high-capacity disk drives are posing a problem, as a RAID rebuild could take a week to finish! Although, I must add, some high-end RAID systems automatically detect failing drives in advance and start migrating data pre-emptively.

Fear not all

A newer technology being deployed on both SAN and NAS storage products, and replacing RAID (Redundant Array of Inexpensive Disks), is software-defined storage, where an object store is used.

Object storage uses individual servers with compute, networking, memory and storage in a cluster.

So rather than buying a big pool of storage upfront, object storage works by starting out with typically three nodes and adding more nodes as demand increases.  The data is also distributed throughout the nodes, providing a higher level of data protection than RAID.

If you need capacity, you could have four nodes with 32x 16TB drives in total, plus an extra sprinkling of NVMe spice for performance.  The raw capacity would be somewhere in the region of 512TB just using hard disks!
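
The capacity maths, plus a hedged look at usable space: object stores protect data with replication or erasure coding, so usable capacity is a fraction of raw. The 8+2 erasure-coding scheme below is an assumed example for illustration, not a product figure:

```python
def raw_tb(drives: int, drive_tb: int) -> int:
    """Raw cluster capacity before any data protection."""
    return drives * drive_tb

def usable_tb(raw: float, data_shards: int = 8, parity_shards: int = 2) -> float:
    """Usable capacity under k+m erasure coding (k data, m parity shards)."""
    return raw * data_shards / (data_shards + parity_shards)

raw = raw_tb(32, 16)        # 32 x 16TB drives = 512TB raw
print(raw, usable_tb(raw))  # 512 409.6
```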

All fun and games, but what about backup?

512TB backup will be a walk in the park…

Well actually, it is, in fact it’s a walk in the park on a warm summer’s day.

Converged storage makes backup a seamless task and combines all of the following:

  • A scale out Object Store
  • NAS storage providing NFS & SMB
  • Complete DR across multiple sites
  • Data encryption at rest
  • Backup for your entire physical and virtual estate
  • Guaranteed SLA, RPO & RTO
  • Replication from One-2-Many or Many-2-One sites
  • ROBO
  • Backup locally or to the cloud
  • Built in data analytics
  • Simple to install and manage
  • Simplified site licensing and support

We know lots and love to share our knowledge

During the next 5 years the data storage landscape will change with NVMe, PCIe 4.0 and NVMe-oF.  Data is now doubling every 2 years and likely to increase further with new applications and emerging technology such as 5G and IoT devices.

If you want to find out how we can help, call us on 01256 331614 or email

If you want to know more read our web page on SAN vs NAS

IT Consultancy Service Tue, 08 Oct 2019 13:19:01 +0000 We provide an IT Consultancy Service and aim to deliver the right level of IT consultancy by choosing the most relevant IT consultant for your needs and requirements. Our specialist expertise enables us to deliver results within a time-frame, helping with project delivery and offering further expertise or guidance at a tactical level to support the business's plans.

IT Consultancy Service

To find out more on our IT consultancy service and let us deliver a world class IT infrastructure, please visit the link below.

IT Consultation

All Flash NVMe Storage Wed, 02 Oct 2019 14:14:12 +0000 OK, I think it's fair to say we don't need an Apache attack helicopter with all sorts of cutting-edge, super-dangerous weaponry to get an edge over our competition, but all flash NVMe storage currently delivers the fastest performance available.

Nonetheless, deploying the best of the best in terms of I.T. equipment is essential for businesses to remain competitive. The compounded effects of an improper, inadequate infrastructure have measurable negative consequences.

Any business or organisation today considering a data centre upgrade should seriously be looking at the potential that all flash NVMe storage arrays provide over all other types of storage.  Traditional SAS, SATA and Fibre Channel connected storage all require an HBA card or controller that sits between the storage and the CPU; this is where NVMe storage differs.  NVMe flash addresses the CPU directly, negating the need for an HBA card.

  1. Performance – Using NVMe is up to 53x faster than hard disks and at least 5x faster than SSD.
  2. How does it work? NVMe flash storage uses PCIe 3.0 or 4.0 to connect directly to the CPU, unlike SAS/SATA/Fibre Channel, which require an HBA (Host Bus Adapter).
  3. What does NVMe stand for? Non-Volatile Memory Express
  4. How can NVMe flash connect to our infrastructure? NVMe-oF (NVMe over Fabric using fibre channel), NVMe over Infiniband, Ethernet (RoCE and iWARP). 

All flash NVMe storage performance

NVMe uses multiple PCIe bus lanes.  Each lane has 2 pairs of wires to send and receive data.

PCIe 3.0 – Supports one, four, eight or sixteen lanes in a single PCIe slot, denoted X1, X4, X8 or X16.  Therefore the maximum performance for PCIe 3.0 is approximately 1GB/s per lane x 16 lanes x 2 directions = 32GB/s in a single PCIe slot.

PCIe 4.0 – Doubles the per-lane rate to approximately 2GB/s, so the maximum performance for PCIe 4.0 is approximately 2GB/s per lane x 16 lanes x 2 directions = 64GB/s in a single PCIe slot.  PCIe 4.0 cards will work in a PCIe 3.0 slot but will operate at PCIe 3.0 speeds.
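
The per-slot sums can be wrapped in a small helper. Here per_lane_gbs is the approximate usable one-direction rate (~1GB/s for PCIe 3.0, ~2GB/s for PCIe 4.0), and the doubling accounts for simultaneous send and receive on each lane's two wire pairs:

```python
# Approximate bidirectional bandwidth of a PCIe slot.
def slot_bandwidth_gbs(lanes: int, per_lane_gbs: float) -> float:
    return lanes * per_lane_gbs * 2

print(slot_bandwidth_gbs(16, 1.0))  # PCIe 3.0 x16: 32 GB/s
print(slot_bandwidth_gbs(16, 2.0))  # PCIe 4.0 x16: 64 GB/s
```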

NVMe can handle 64,000 command queues and send 64,000 commands per queue at the same time, whereas a SATA SSD or hard disk has only one command queue and can send a maximum of 32 commands per queue.  NVMe commands also require relatively few CPU cycles compared with SAS/SATA drives.  Ideally, the CPU should have as many cores as possible in order to sustain the transfer rates that all flash NVMe storage provides.

Over the next 5 years, all data storage systems will be using NVMe & PCIe 4.0.  With newer flash memory technologies emerging such as 3D XPoint and Optane, the future roadmap for NVMe based flash arrays is very bright.  

Get yourself a data storage Apache, get yourself all flash NVMe storage for your business infrastructure.

Smarter, Strategic, Thinking.

Want to know more about NVMe :

All Flash NVMe Storage

Visit our web page for more details: