Reliability, Scalability, Maintainability

Data systems as developers see them: databases, caches, search indexes, stream processing, batch processing. Also: how data is distributed across disks, and how data is encoded.

Reliability - anticipating faults and being able to tolerate them. A fault is a component failing; a failure is the system as a whole failing. Fault types: hardware faults, software faults, human errors. Mitigations: telemetry, well-designed interfaces, decoupling, testing.

Scalability - described via load parameters: requests to a web server, reads or writes to a database, cache hit rate, number of active users. The relevant parameter could be the average case or a small number of extreme cases.

Twitter example: 4.6K tweets per second on average, peaking around 12K writes per second, but 300K timeline reads per second. So the work is done at write time - each tweet is pushed into the timeline caches of the individual followers - so that read time can be faster. Writes become a challenge when they involve so much legwork, but the fan-out is still done within about 5 seconds. Twitter now uses a hybrid model where most tweets follow the above approach, but celebrity tweets are merged in at read time.

Performance : throughput...
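The hybrid fan-out idea above can be sketched in a few lines. This is a toy illustration, not Twitter's actual design or API - the names, the threshold, and the in-memory dicts standing in for timeline caches are all assumptions:

```python
from collections import defaultdict

# Toy sketch of hybrid fan-out (all names and the threshold are illustrative).
# Ordinary users' tweets are pushed to every follower's cache at write time;
# a celebrity's tweets would be merged in at read time instead.
CELEBRITY_THRESHOLD = 1_000_000   # assumed cutoff for "celebrity"

followers = defaultdict(set)        # author -> set of follower ids
timeline_cache = defaultdict(list)  # user -> precomputed timeline entries
celebrity_tweets = defaultdict(list)  # celebrity author -> their tweets

def post_tweet(author, text):
    if len(followers[author]) >= CELEBRITY_THRESHOLD:
        # Too many followers: defer the fan-out work to read time.
        celebrity_tweets[author].append(text)
    else:
        # Fan out on write: one cache insert per follower.
        for f in followers[author]:
            timeline_cache[f].append((author, text))

def read_timeline(user, following):
    # Cheap read: cached entries, plus celebrity tweets merged in now.
    merged = list(timeline_cache[user])
    for author in following:
        if len(followers[author]) >= CELEBRITY_THRESHOLD:
            merged.extend((author, t) for t in celebrity_tweets[author])
    return merged

# Ordinary-user path: the write does the work, the read is a cache lookup.
followers["alice"] = {"bob", "carol"}
post_tweet("alice", "hello")
```

The trade-off is visible in the shape of the code: `post_tweet` is O(followers) for ordinary users, while `read_timeline` stays cheap; for celebrities the cost moves to the read path.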
So I've roughly navigated this course, and it was quite a challenge getting through the theory, because once the architectures were discussed it was just a whole lot of clicking through AWS, and that can get boring.

Architecture : The idea is a back office and a front office. The back office is all the sources and the individual processes that bring in data. The data warehouse simplifies these schemas into (possibly) a star model, which makes the data easier and faster to use for the analytics / BI division (the front office). There are a few variants: BI can directly access the main source, the DW can be department-specific, or each department can have its own DW while still maintaining integrity among the common columns.

Cloud / AWS : Doing this in the cloud gives quicker start-up time, elasticity, and scalability. The general idea is to read from the sources, move the data into a staging S3 bucket, and then push it into the DW. For smaller tables it might be possible to directly use an EC2...
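The extract -> stage -> load shape described above can be sketched locally. This is a simulation under stand-in assumptions: a temp directory plays the role of the S3 staging bucket and an in-memory sqlite3 database plays the warehouse; on AWS the same two steps would become an S3 upload followed by a warehouse bulk load (e.g. a Redshift COPY), and the table/column names here are made up:

```python
import csv
import os
import sqlite3
import tempfile

def extract_to_staging(rows, staging_dir):
    """Write source rows to a CSV 'object' in the staging area."""
    path = os.path.join(staging_dir, "orders.csv")
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
    return path

def load_into_warehouse(conn, staged_path):
    """Bulk-load the staged file into a warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
    with open(staged_path, newline="") as f:
        # SQLite's type affinity coerces the CSV strings to the column types.
        conn.executemany("INSERT INTO orders VALUES (?, ?)", csv.reader(f))
    conn.commit()

# Stand-ins: temp dir ~ staging S3 bucket, in-memory sqlite ~ the DW.
staging = tempfile.mkdtemp()
conn = sqlite3.connect(":memory:")
staged = extract_to_staging([(1, 9.99), (2, 25.00)], staging)
load_into_warehouse(conn, staged)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

Keeping the staged file as an intermediate artifact (rather than streaming source to warehouse directly) is what makes the load step restartable and auditable, which is the main point of the staging bucket.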
< 2 years back - need to brush up > This was a good intro to relational modeling and to data warehousing with facts and dimensions: using grouping sets and cubes for faster analytics, plus slicing, dicing, drill-down and roll-up. The project was a long series of transformations converting a relational model into a data warehouse model. There were also several methods for creating schemas repetitively. When I brush up on this course, I will update the notes here along with code snippets.
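Until I get to proper snippets, here is a rough sketch of what a CUBE over a fact table computes: the aggregate for every subset of the dimension columns. The tiny fact table and column names are made up for illustration; in SQL this would be a single `GROUP BY CUBE(region, product)`:

```python
from collections import defaultdict
from itertools import combinations

# Tiny made-up fact table: (region, product, amount).
sales = [
    ("east", "widget", 100),
    ("east", "gadget", 50),
    ("west", "widget", 70),
]

def cube(rows, dims, measure_idx):
    """Aggregate the measure over every subset of the dimension columns,
    i.e. what GROUP BY CUBE produces: {grouping set: {key: sum}}."""
    result = {}
    for r in range(len(dims) + 1):
        for subset in combinations(range(len(dims)), r):
            agg = defaultdict(int)
            for row in rows:
                key = tuple(row[i] for i in subset)
                agg[key] += row[measure_idx]
            result[tuple(dims[i] for i in subset)] = dict(agg)
    return result

totals = cube(sales, ["region", "product"], 2)
# totals[()] holds the grand total; totals[("region",)] is the roll-up by
# region; totals[("region", "product")] is the full drill-down.
```

Drill-down and roll-up are then just moves between grouping sets: from `("region",)` down to `("region", "product")` and back up toward the grand total `()`.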