Verified User in Computer Software
Director of Data Management at Barclays

I wanted to load 200 TB of data into Snowflake

How can I scale the performance to load this faster?
Javed S.
Award-Winning Founder, CEO @ Lyftrondata | Speaker | Entrepreneur & Investor
Hello Rajeev, you can analyze data of any size in more than 35 visualization tools. The agile data delivery model can process trillions of rows and tables, and delivers unmatched BI performance and limitless scalability for Snowflake users. Run real-time SQL queries on any data source, and create data sets to share between teams and analytics tools. Thanks, Lyftron Data Angel
Rahul P.
Software engineer at Barclays | Frontend | JavaScript
Loading from gzipped CSV is the fastest option, faster than loading from ORC or Parquet, at roughly 15 TB/hour.
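
For anyone who finds this later, here is a minimal sketch of that approach, assuming the 200 TB has already been split into many ~100-250 MB gzipped CSV files on an external stage. The account, stage, table, and warehouse names are placeholders, not anything from this thread.

# Sketch only: bulk-load split gzipped CSV files with COPY INTO via the
# Snowflake Python connector. All identifiers below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # placeholder account identifier
    user="loader",              # placeholder credentials
    password="***",
    warehouse="LOAD_WH_3XL",    # a larger warehouse loads split files in parallel
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()

# Assumes the gzipped CSV chunks were uploaded to an external stage
# (e.g. S3) registered as @raw_stage.
cur.execute("""
    COPY INTO raw_events
    FROM @raw_stage/events/
    FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP FIELD_OPTIONALLY_ENCLOSED_BY = '"')
    ON_ERROR = 'CONTINUE'
""")

print(cur.fetchall())  # COPY INTO returns one status row per loaded file
conn.close()

Splitting the input into many files is what makes the bigger warehouse pay off: a single COPY INTO parallelizes across files, so it only scales if there are enough chunks to keep the warehouse busy.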