SQL Server and SqlDataReader - Trillion Records - Memory

Problem Description

I've never tried this - so I don't know if I'd run into memory issues.

But can a SqlDataReader read a trillion records? It's all streamed, correct? I'm a little green on what the SQL/TDS protocol is doing under the covers.

UPDATE: Read "trillion" as meaning a very large number. I probably should have said something like 1 billion or 100 million.

Recommended Answer

Yes, that will stream... but I don't think you should actually try to do it.

If you could read a million records per second (which sounds unlikely to me), you'd still need roughly 12 days to read a trillion records: 10^12 rows at 10^6 rows/sec is 10^6 seconds, about 11.6 days... that's a lot of work to risk losing halfway through.

Now I realise you probably don't really want to read a trillion records, literally, but my point is that if you can separate your "large amount" of work into logical batches anyway, that's probably a good idea.
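For what it's worth, one common way to batch such a scan is keyset pagination: read TOP (N) rows ordered by an indexed key, remember the last key you saw, and resume from there. The sketch below assumes a dbo.BigTable with an indexed bigint Id column; none of these names come from the original answer. Because each batch is an independent query, a failure partway through only costs you the current batch, not the whole run.

```csharp
using Microsoft.Data.SqlClient; // System.Data.SqlClient on older stacks

class BatchReadSketch
{
    // Walks dbo.BigTable in batches of batchSize, keyed on an indexed Id column.
    static void ReadInBatches(string connectionString, int batchSize)
    {
        long lastId = 0;    // checkpoint: highest key processed so far

        while (true)
        {
            using var connection = new SqlConnection(connectionString);
            connection.Open();

            using var command = new SqlCommand(
                @"SELECT TOP (@batch) Id
                  FROM dbo.BigTable
                  WHERE Id > @lastId
                  ORDER BY Id", connection);
            command.Parameters.AddWithValue("@batch", batchSize);
            command.Parameters.AddWithValue("@lastId", lastId);

            int rowsInBatch = 0;
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    lastId = reader.GetInt64(0);
                    rowsInBatch++;
                    // ... process the row; persist lastId somewhere durable
                    //     so a crash resumes here instead of at row zero ...
                }
            }

            if (rowsInBatch < batchSize) break; // short batch means we've reached the end
        }
    }
}
```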
