When we talk about Big Data, the first thing that comes to mind is the amount, or volume, of data.
Here the problem is:
How to store a large amount of data? ...No problem... use... HDFS
How to process a large amount of data? ...No problem... use... Hadoop MapReduce
Big Data volume refers to data that is larger than the normal volume of data that traditional systems have so far stored and processed. When data grows beyond that normal volume, it is definitely challenging and expensive to load and process. Let us see how this can be achieved in the Hadoop framework.
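As a minimal sketch of the storage side (the NameNode URI and file paths below are assumptions for illustration, not values from this post), copying a local file into HDFS through the Java API looks like this:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Minimal sketch: copy a local file into HDFS using the Java API.
public class HdfsPutExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address; replace with your cluster's fs.defaultFS.
        conf.set("fs.defaultFS", "hdfs://localhost:9000");

        FileSystem fs = FileSystem.get(conf);
        // HDFS splits the file into blocks and replicates them across DataNodes.
        fs.copyFromLocalFile(new Path("/tmp/bigdata.log"),
                             new Path("/user/demo/bigdata.log"));
        fs.close();
    }
}
```

HDFS splits the file into fixed-size blocks (128 MB by default in Hadoop 2.x) and replicates each block across DataNodes, which is what makes storing very large files feasible on commodity hardware.

On the processing side, a MapReduce job divides the work into map and reduce phases. Below is a sketch of the classic word-count mapper (the class name is illustrative): each input line is split into words, and each word is emitted with a count of 1 for the reducer to sum.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Map side of a word-count job: one (word, 1) pair per word in the input.
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        for (String token : value.toString().split("\\s+")) {
            if (!token.isEmpty()) {
                word.set(token);
                context.write(word, ONE); // the reducer sums the 1s per word
            }
        }
    }
}
```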
Data units in Big Data
Unit | Disk Storage (decimal) | Processor or Virtual Storage (binary) | Bytes (decimal)
---|---|---|---
1 GB | 1000 MBs | 1024 MBs | 1 x 10⁹ Bytes
1 TB | 1000 GBs | 1024 GBs | 1 x 10¹² Bytes
1 PB | 1000 TBs | 1024 TBs | 1 x 10¹⁵ Bytes
1 EB | 1000 PBs | 1024 PBs | 1 x 10¹⁸ Bytes
1 ZB | 1000 EBs | 1024 EBs | 1 x 10²¹ Bytes
1 YB | 1000 ZBs | 1024 ZBs | 1 x 10²⁴ Bytes
1 BB | 1000 YBs | 1024 YBs | 1 x 10²⁷ Bytes
1 GeopB | 1000 BBs | 1024 BBs | 1 x 10³⁰ Bytes
GB - Gigabytes
TB - Terabytes
PB - Petabytes
EB - Exabytes
ZB - Zettabytes
YB - Yottabytes
BB - Brontobytes (informal, not an official SI/IEC unit)
GeopB - Geopbytes (informal, not an official SI/IEC unit)
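To make the two columns concrete, here is a minimal Java sketch (class and method names are illustrative) that formats a raw byte count in both decimal (1000-based) and binary (1024-based) units; IEC suffixes (KiB, MiB, ...) are used for the binary side to keep the two scales distinguishable:

```java
// Convert a byte count to decimal (power-of-1000) and binary (power-of-1024)
// units, mirroring the two storage columns in the table above.
public class DataUnits {
    private static final String[] DECIMAL = {"B", "KB", "MB", "GB", "TB", "PB", "EB"};
    private static final String[] BINARY  = {"B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB"};

    static String format(double bytes, double base, String[] units) {
        int i = 0;
        while (bytes >= base && i < units.length - 1) {
            bytes /= base;   // step up one unit each time the base divides in
            i++;
        }
        return String.format("%.2f %s", bytes, units[i]);
    }

    public static void main(String[] args) {
        double oneTB = 1e12; // 1 TB as disk vendors count it (decimal)
        System.out.println(format(oneTB, 1000, DECIMAL)); // prints: 1.00 TB
        System.out.println(format(oneTB, 1024, BINARY));  // prints: 0.91 TiB
    }
}
```

This is why a disk sold as 1 TB (10¹² bytes) shows up as roughly 0.91 TiB on an operating system that counts in powers of 1024.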