
Page 1

©2017 Micron Technology, Inc. All rights reserved. Information, products, and/or specifications are subject to change without notice. All information is provided on an “AS IS” basis without warranties of any kind. Statements regarding products, including regarding their features, availability, functionality, or compatibility, are provided for informational purposes only and do not modify the warranty, if any, applicable to any product. Drawings may not be to scale. Micron, the Micron logo, and all other Micron trademarks are the property of Micron Technology, Inc. All other trademarks are the property of their respective owners.

Ceph in a Flash: Micron’s Adventures in All-Flash Ceph Storage
Ryan Meredith & Brad Spiers, Micron Principal Solutions Engineer and Architect

Page 2

Solve the Storage Optimization Puzzle with Micron

We’ve Done the Tuning for You
Consider Micron-Powered Ceph Architectures
Discuss How Your Workloads Could Benefit From 3D XPoint™ or Persistent Memory
When It’s Time – How SSDs Might Even Be Best for Archive

Page 3

Micron Storage Solutions Engineering

Austin, TX
BFL - Big Fancy Lab

Real-world application performance testing using Micron Storage & DRAM:
– Ceph, VSAN, Storage Spaces
– Hadoop, Spark
– MySQL, MSSQL, PostgreSQL
– Cassandra, MongoDB

Page 4

Performance Comparison of Micron-Powered Ceph Architectures

Page 5

Micron-Powered Ceph Architectures: Performance Comparison

|                     | Micron SATA PoC         | Micron SAS+SATA PoC                              | Micron NVMe RA                          |
|---------------------|-------------------------|--------------------------------------------------|-----------------------------------------|
| Completion Date     | March 2016              | July 2016                                        | April 2017                              |
| # of Storage Nodes  | 8                       | 10                                               | 4                                       |
| # of Drives/Node    | 10x 800GB Micron M510DC | 2x 800GB Micron S650DC + 10x 800GB Micron M510DC | 10x 2.4TB Micron 9100MAX NVMe SSD       |
| Raw Capacity / Node | 8TB                     | 8TB                                              | 24TB                                    |
| CPU                 | Intel 2690v3 x2         | Intel 2690v4 x2                                  | Intel 2699v4 x2                         |
| RAM                 | 256GB                   | 256GB                                            | 256GB                                   |
| Network             | Mellanox 40GbE          | Mellanox 40GbE                                   | Mellanox 50GbE                          |
| OS                  | Ubuntu 14.04            | RHEL 7.2                                         | RHEL 7.3                                |
| Ceph Version        | Ceph Hammer 0.94.5      | Red Hat Ceph Storage 1.3.2 (Hammer 0.94.5)       | Red Hat Ceph Storage 2.1 (Jewel 10.2.3) |

Page 6

Micron-Powered Ceph Architectures: Performance Comparison – RBD FIO 4KB Random Performance / Node

|                                   | 4KB Random Read | 4KB Random Write |
|-----------------------------------|-----------------|------------------|
| Micron SATA, Ceph 0.94.5          | 125K IOPS       | 10K IOPS         |
| Micron SAS + SATA, RH Ceph 1.3    | 199K IOPS       | 23K IOPS         |
| Micron 9100 MAX NVMe, RH Ceph 2.1 | 287K IOPS       | 60K IOPS         |

Page 7

Micron + Red Hat + Supermicro All-NVMe Ceph Reference Architecture

Page 8

Hardware Configuration

Storage Nodes (x4):

– Supermicro Ultra Server SYS-1028U-TN10RT+

– 2x Intel 2699v4 22 core Xeon

– 256GB DDR4-2400 DRAM (8x 32GB)

– 2x Mellanox 50GbE 1-port NICs

– 10x Micron 2.4TB 9100MAX NVMe SSD

Monitor Nodes (x3):

– Supermicro SYS-1028U-TNRT+ (1U)

Network:

– 2 Supermicro 100GbE 32-Port Switches

– SSE-C3632SR


Page 9

Software Configuration

Storage + Monitor Nodes

– Red Hat Ceph Storage 2.1 (Jewel 10.2.3)

– Red Hat Enterprise Linux 7.3

– Mellanox OFED Driver 3.4.2

Switch OS

– Cumulus Linux 3.1.2

Deployment Tool

– Ceph-Ansible (a minimal inventory sketch follows below)

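For context, deployments like this are normally driven from a small ceph-ansible inventory plus a few group_vars overrides. The sketch below is a minimal, hypothetical example only: the host names, interface name, device paths, and network ranges are placeholders, not the values used in this reference architecture, and variable names follow ceph-ansible 2.x-era conventions.

```bash
# Minimal ceph-ansible deployment sketch (hypothetical hosts, devices, and networks).
# Run from a ceph-ansible checkout; repository/origin settings are omitted here.
cat > hosts <<'EOF'
[mons]
mon[1:3]

[osds]
stor[1:4]
EOF

cat > group_vars/all.yml <<'EOF'
fetch_directory: ~/ceph-fetch
monitor_interface: enp131s0      # placeholder NIC name
public_network: 10.0.10.0/24     # placeholder client-facing network
cluster_network: 10.0.20.0/24    # placeholder replication network
EOF

cat > group_vars/osds.yml <<'EOF'
# One entry per OSD data device; FileStore journals colocated on the same NVMe SSD.
devices:
  - /dev/nvme0n1
  - /dev/nvme1n1
journal_collocation: true
EOF

ansible-playbook -i hosts site.yml
```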

Page 10

Performance Testing Methodology

FIO RBD for Block Tests

RADOS Bench for Object Tests (example fio and rados bench invocations are sketched below)

12x Supermicro SYS-2028U Load Generators

– Mellanox ConnectX-4 40GbE Networking

Tests kicked off on multiple clients simultaneously

15-minute test runs x 3 for recorded average performance results

5TB of Data on 2x Replicated pool (10TB total Data)

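To make the two test types concrete, the sketch below shows roughly what a 4KB random-read fio RBD job and a 4MB RADOS Bench run of this kind look like. The pool name, image name, queue depth, and client count are hypothetical placeholders; the deck does not state the exact job parameters used.

```bash
# 4KB random read against an RBD image via fio's rbd ioengine
# (hypothetical pool/image names; 900s = the 15-minute runs described above).
cat > rbd-4k-randread.fio <<'EOF'
[global]
ioengine=rbd
clientname=admin
pool=rbd_bench
rbdname=testimage01
time_based=1
runtime=900
bs=4k
rw=randread
iodepth=32
numjobs=1

[rbd-4k-randread]
EOF
fio rbd-4k-randread.fio

# 4MB object write, then sequential object read, with RADOS Bench.
rados bench -p rbd_bench 900 write --no-cleanup -b 4194304 -t 32
rados bench -p rbd_bench 900 seq -t 32
```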

Page 11

Drive Scaling – Tested in Reference Architecture

Drives were scaled up to determine the performance sweet spot: a balance between CPU utilization, network, storage, and Ceph itself.
8+ OSD processes are necessary to fully utilize 2x 2699v4 CPUs (see the partitioning sketch after the table below).

|                     | 2 Drives / Storage Node | 4 Drives / Storage Node | 10 Drives / Storage Node |
|---------------------|-------------------------|-------------------------|--------------------------|
| Total # of Drives   | 8                       | 16                      | 40                       |
| Total Raw Capacity  | 19.2 TB                 | 38.4 TB                 | 96 TB                    |
| # of OSDs per Drive | 4                       | 2                       | 1                        |
| Total # of OSDs     | 32                      | 32                      | 40                       |
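Running more than one OSD per NVMe drive (4 per drive at 2 drives/node, 2 per drive at 4 drives/node) is typically done by splitting each device into equal data partitions and preparing an OSD on each. The sketch below is one plausible way to do that with Jewel-era FileStore tooling; the device path and layout are illustrative assumptions, not the exact procedure used for this reference architecture.

```bash
# One plausible way to run 4 OSDs on a single NVMe SSD (Jewel / FileStore era).
DEV=/dev/nvme0n1                      # hypothetical device path
parted -s "$DEV" mklabel gpt
for i in 1 2 3 4; do
    # Four equal partitions: 0-25%, 25-50%, 50-75%, 75-100%.
    parted -s "$DEV" mkpart osd-$i $(( (i - 1) * 25 ))% $(( i * 25 ))%
done
for i in 1 2 3 4; do
    # ceph-disk uses each partition as an OSD data store; with no journal device
    # given, the FileStore journal lands on the same partition (colocated).
    ceph-disk prepare "${DEV}p${i}"
    ceph-disk activate "${DEV}p${i}"
done
```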

Page 12

FIO RBD 4KB Random Read Performance

4 & 10 Drives/Node: CPU Limited
2 Drives/Node: Drive & CPU Limited

[Charts: 4KB Random Read IOPs by drives/node and raw capacity; 4KB Random Read Storage Node CPU%]

| 4KB Random Read | IOPs  | Latency |
|-----------------|-------|---------|
| 2 Drives/Node   | 745K  | 2.1ms   |
| 4 Drives/Node   | 1.13M | 1.38ms  |
| 10 Drives/Node  | 1.15M | 1.11ms  |

Page 13

FIO RBD 4KB Random Write Performance

4 & 10 Drives/Node: CPU Limited
2 Drives/Node: Drive Limited

| 4KB Random Write | IOPs | Latency |
|------------------|------|---------|
| 2 Drives/Node    | 163K | 60usec  |
| 4 Drives/Node    | 240K | 50usec  |
| 10 Drives/Node   | 242K | 50usec  |

[Charts: 4KB Random Write IOPs by drives/node and raw capacity; 4KB Random Write Storage Node CPU%]

Page 14

Rados Bench 4MB Object Read Performance

Object Read is Network Limited

| 4MB Object Read | Throughput          | Latency |
|-----------------|---------------------|---------|
| 2 Drives/Node   | 20.7 GB/s, 166 Gbps | 37ms    |
| 4 Drives/Node   | 21.2 GB/s, 170 Gbps | 36ms    |
| 10 Drives/Node  | 21.8 GB/s, 174 Gbps | 35ms    |

[Charts: 4MB Read (GB/s) by drives/node; 4MB Object Read CPU%]

Page 15

Rados Bench 4MB Object Write Performance

Object Write is Drive Limited

| 4MB Object Write | Throughput        | Latency |
|------------------|-------------------|---------|
| 2 Drives/Node    | 1.8 GB/s, 14 Gbps | 140ms   |
| 4 Drives/Node    | 3.2 GB/s, 26 Gbps | 81ms    |
| 10 Drives/Node   | 4.6 GB/s, 37 Gbps | 41ms    |

[Charts: 4MB Write (GB/s) by drives/node; 4MB Object Write Drive Latency (ms)]

Page 16

4KB Block Performance Summary (RBD)

4 Micron 9100MAX NVMe drives per storage node is the optimal configuration for IOPs per node
– Increasing past 4 drives only marginally reduces latency and increases IOPs
Red Hat Ceph Storage 2.1 can saturate 2x Intel 2699v4s with 8 to 10 OSDs, provided proper tuning and sufficiently fast drives
4KB reads will saturate a 10GbE link at this performance level; 25GbE+ is recommended (see the arithmetic below)
4KB writes can be serviced by 10GbE at this performance level

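A quick sanity check on the network claims, using the per-node numbers from page 6: 287K read IOPs per node × 4 KiB ≈ 1.2 GB/s ≈ 9.4 Gbps of client traffic per storage node, which is essentially line rate for a single 10GbE port; the 60K write IOPs per node works out to ≈ 0.25 GB/s ≈ 2 Gbps, which fits comfortably within 10GbE.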

Page 17

Object Performance Summary (RADOS Bench)

Reads are always network limited with 50GbE, even with 2 drives per node
Writes are drive limited and can saturate 25GbE
– A symptom of large object writes with journals and OSDs co-located (see the estimate below)
– Preliminary testing with Kraken + BlueStore showed large improvements in 4MB writes
CPU utilization is low

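To see why co-located FileStore journals make object writes drive limited, consider the 10 drives/node case: 4.6 GB/s of client writes × 2 (replication) × 2 (journal write plus data write on the same SSD) ≈ 18.4 GB/s of raw drive writes spread across 40 drives, or roughly 460 MB/s per drive. That lines up with the ~479 MB/s average per-drive throughput reported in the backup slides (page 38).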

Page 18

Platform Notes

CPU 1 has 6 NVMe drives
CPU 2 has 4 NVMe drives + 2 PCIe x8 slots (2x 1-port 50GbE NICs)
Pinning OSD processes to specific CPUs did not net a performance gain (a sketch of how to experiment with this follows below)
Good old irqbalance did a decent job of evenly distributing load
50GbE is the fastest NIC available for a PCIe x8 slot; this server could not use 100GbE
– 4MB object read is the only test that would benefit from 100GbE

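For reference, the kind of OSD-to-CPU pinning experiment described above can be done with a systemd drop-in. The sketch below is a generic, hypothetical example; the deck does not describe the exact mechanism Micron used, and the core list shown is a placeholder for one socket's cores.

```bash
# Pin all ceph-osd instances to the physical cores of CPU socket 0 via a systemd drop-in
# (assumes cores 0-21 belong to socket 0; adjust to the socket that owns the OSD's NVMe drive).
mkdir -p /etc/systemd/system/ceph-osd@.service.d
cat > /etc/systemd/system/ceph-osd@.service.d/cpuaffinity.conf <<'EOF'
[Service]
CPUAffinity=0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21
EOF
systemctl daemon-reload
systemctl restart ceph-osd.target
```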

Page 19

Future Ceph Testing at Micron

Page 20

NVDIMM: Non-volatile DRAM

8GB – 16GB capacities

– Fits in DDR4 DRAM Slot

– Supercapacitor allows DRAM to dump to local flash during a power outage

– Crazy fast

Possible use cases for Ceph (a configuration sketch follows below)

– Small journals (2GB-4GB) in front of NVMe OSDs (Jewel)

– Storage for RocksDB data using BlueStore (Kraken / Luminous)

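To make the journal use case concrete, the sketch below shows roughly how a FileStore OSD could be pointed at a small journal on an NVDIMM exposed as a block device. The device path, journal size, and the assumption that the NVDIMM appears as /dev/pmem0 are illustrative, not details from the deck.

```bash
# Hypothetical sketch: a 4GB FileStore journal on an NVDIMM block device (/dev/pmem0)
# in front of an NVMe data device (Jewel-era ceph-disk syntax).
# In /etc/ceph/ceph.conf, under [osd]:
#   osd journal size = 4096    # MB, matching the 2GB-4GB journal idea above
parted -s /dev/pmem0 mklabel gpt
parted -s /dev/pmem0 mkpart journal-0 0% 50%
ceph-disk prepare /dev/nvme0n1 /dev/pmem0p1   # data device first, journal device second
```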

Page 21

Micron 5100 SATA SSD

8TB capacity

– Utilizes 3D TLC NAND

– 1U Storage Nodes, up to 80TB/Storage Node

Possible Ceph Architectures

– All SATA Capacity Solution

– SATA + NVMe Journals

– SATA + NVDIMM Journals


Page 22

+ Bankable TCO

+ Bringing data closer to CPU

+ Exponential capacity and speed increases

+ Drastic reduction/re-investment in datacenter real estate

+ Dramatic cut in physical power / energy costs

Flash optimizes your space smarter, better, faster.

Page 23

What is the Impact of System Implementation? Traditional SAN to All-Flash SAN

[Diagram: media and system latency tiers, from software protection & scale (server-side media) to array protection & scale (SAN arrays), ranked FASTEST to FAST]

| Tier                                               | Typical latency |
|----------------------------------------------------|-----------------|
| DDR / Block-Mode NVDIMM                            | <100ns          |
| 3D XPoint™ NVMe                                    | <10µs           |
| NVMe SSD (NAND Flash)                              | 25µs            |
| SATA SSD (NAND Flash)                              | 300µs           |
| SAS HDD, 10K RPM                                   | 6ms             |
| All-Flash Array, Fibre Channel SAN                 | 12ms            |
| Hybrid Array (7.2K/10K/15K HDD), Fibre Channel SAN | 30ms            |

Highlighted comparison: hybrid-array FC SAN (30ms) vs. all-flash-array FC SAN (12ms) – roughly 2.5x lower latency

Page 24

Moving into the Realm of Real Advantages: SAN to vSAN with SATA SSD

[Same media/system latency diagram as page 23]

Highlighted comparison: hybrid FC SAN (30ms) vs. vSAN on SATA SSD (300µs) – roughly 100x lower latency

Page 25

Not All Flash-Built Systems are the Same: AFA SAN to vSAN with SATA SSD

[Same media/system latency diagram as page 23]

Highlighted comparison: all-flash FC SAN (12ms) vs. vSAN on SATA SSD (300µs) – roughly 40x lower latency

Page 26

The NVMe Advantage Can Vary By Use: NVMe vSAN over AFA SAN

[Same media/system latency diagram as page 23]

Highlighted comparison: all-flash FC SAN (12ms) vs. vSAN on NVMe SSD (25µs) – roughly 480x lower latency

Page 27

What about Non-Volatile RAM? NVDIMM is Data Safe and Fast

[Same media/system latency diagram as page 23]

Highlighted improvement factor: roughly 3,000x lower latency for block-mode NVDIMM

Page 28

Summary: Solve the Storage Optimization Puzzle with Micron

We’ve Done the Tuning for You
Consider Micron-Powered Ceph Architectures
Discuss How Your Workloads Could Benefit From 3D XPoint™ or Persistent Memory
When It’s Time – How SSDs Might Even Be Best for Archive

Visit Us In Booth B3 For More Information

Page 29

Thanks, All!
These slides will be available on the OpenStack conference website.
Reference architecture available now!

Page 30

Micron Ceph Collateral

Micron NVMe Reference Architecture:
https://www.micron.com/solutions/micron-accelerated-solutions

Direct Link to RA Document:
https://www.micron.com/~/media/documents/products/technical-marketing-brief/accelerated_ceph_solution_nvme_ref_arch.pdf

Page 31

Backup

Page 32

FIO RBD 4KB Random Read Performance – Micron + Red Hat + Supermicro All-NVMe Ceph Reference Architecture

[Charts: 4KB Random Read IOPs vs. Ceph IOPs/Drive; 4KB Random Read Storage Node CPU%; 4KB Random Read Client Network (MB/s); 4KB Random Read Single Drive IOPs]

| 4KB Random Read | IOPs  | Latency | IOPs/Drive |
|-----------------|-------|---------|------------|
| 2 Drives/Node   | 745K  | 2.1ms   | 93K        |
| 4 Drives/Node   | 1.13M | 1.38ms  | 71K        |
| 10 Drives/Node  | 1.15M | 1.11ms  | 29K        |

Page 33

FIO RBD 4KB Random Read Performance – Micron + Red Hat + Supermicro All-NVMe Ceph Reference Architecture

RBD FIO 4KB Random Read Performance is CPU Limited at 4 & 10 Drives/Node
CPU and Drive Limited at 2 Drives/Node

[Charts: 4KB Random Read Storage Node CPU%; 4KB Random Read IOPs by drives/node]

Page 34

FIO RBD 4KB Random Write Performance – Micron + Red Hat + Supermicro All-NVMe Ceph Reference Architecture

[Charts: 4KB Random Write IOPs vs. Ceph IOPs/Drive; 4KB Random Write Storage Node CPU%; 4KB Random Write Client Network (MB/s); 4KB Random Write Storage Network (MB/s)]

| 4KB Random Write | IOPs | Latency | IOPs/Drive |
|------------------|------|---------|------------|
| 2 Drives/Node    | 163K | 60usec  | 20K        |
| 4 Drives/Node    | 240K | 1.38ms  | 15K        |
| 10 Drives/Node   | 242K | 50usec  | 6K         |

Page 35

FIO RBD 4KB Random Write Performance – Micron + Red Hat + Supermicro All-NVMe Ceph Reference Architecture

[Charts: 4KB Random Write Storage Node CPU%; 4KB Random Write Single Drive IOPs; 4KB Random Write Single Drive Latency (ms); 4KB Random Write IOPs by drives/node]

4KB Write is CPU Limited at 4 & 10 Drives/Node, Drive Limited at 2 Drives/Node


Page 36

Rados Bench 4MB Object Read Performance – Micron + Red Hat + Supermicro All-NVMe Ceph Reference Architecture

[Charts: 4MB Object Read CPU%; 4MB Object Read Single Drive Throughput (MB/s); 4MB Read (GB/s) vs. Ceph Throughput/Drive (GB/s)]

| 4MB Object Read | Throughput          | Latency | Throughput/Drive |
|-----------------|---------------------|---------|------------------|
| 2 Drives/Node   | 20.7 GB/s, 166 Gbps | 37ms    | 2.6 GB/s         |
| 4 Drives/Node   | 21.2 GB/s, 170 Gbps | 36ms    | 1.3 GB/s         |
| 10 Drives/Node  | 21.8 GB/s, 174 Gbps | 35ms    | 0.55 GB/s        |

4MB Object Read is Network Limited


Page 37

Rados Bench 4MB Object Write Performance – Micron + Red Hat + Supermicro All-NVMe Ceph Reference Architecture

[Charts: 4MB Object Write CPU%; 4MB Object Write Client Network (MB/s); 4MB Object Write Storage Network (MB/s); 4MB Write (GB/s) vs. Ceph Throughput/Drive (MB/s)]

| 4MB Object Write | Throughput        | Latency | Throughput/Drive |
|------------------|-------------------|---------|------------------|
| 2 Drives/Node    | 1.8 GB/s, 14 Gbps | 140ms   | 230 MB/s         |
| 4 Drives/Node    | 3.2 GB/s, 26 Gbps | 81ms    | 205 MB/s         |
| 10 Drives/Node   | 4.6 GB/s, 37 Gbps | 41ms    | 118 MB/s         |

Page 38

Rados Bench 4MB Object Write Performance – Micron + Red Hat + Supermicro All-NVMe Ceph Reference Architecture

[Charts: 4MB Object Write Single Drive Throughput; 4MB Object Write Single Drive Latency; 4MB Object Write (GB/s) by drives/node]

| Per-Drive (4MB Object Write) | Average Drive Throughput | Average Drive Latency |
|------------------------------|--------------------------|-----------------------|
| 2 Drives/Node                | 929 MB/s                 | 112ms                 |
| 4 Drives/Node                | 772 MB/s                 | 92ms                  |
| 10 Drives/Node               | 479 MB/s                 | 52ms                  |

4MB Object Write is Drive Limited
