Can I get more than 1000 records from a DirectorySearcher?
Question
I just noticed that the return list for results is limited to 1000. I have more than 1000 groups in my domain (HUGE domain). How can I get more than 1000 records? Can I start at a later record? Can I cut it up into multiple searches?
Here is my query:
DirectoryEntry dirEnt = new DirectoryEntry("LDAP://dhuba1kwtn004");
string[] loadProps = new string[] { "cn", "samaccountname", "name", "distinguishedname" };
DirectorySearcher srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps);
var results = srch.FindAll();
I have tried to set srch.SizeLimit = 2000;, but that doesn't seem to work. Any ideas?
Answer
You need to set DirectorySearcher.PageSize to a non-zero value to get all results.
BTW you should also dispose the DirectorySearcher when you're finished with it:
using (var srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps))
{
    srch.PageSize = 1000;
    var results = srch.FindAll();
}
The API documentation isn't very clear, but essentially:
- when you do a paged search, the SizeLimit is ignored, and all matching results are returned as you iterate through the results returned by FindAll. Results are retrieved from the server a page at a time. I chose the value of 1000 above, but you can use a smaller value if preferred. The trade-off: a smaller PageSize returns each page of results faster, but requires more frequent calls to the server when iterating over a large number of results (see the sketch after this list).
- by default the search isn't paged (PageSize = 0). In this case at most SizeLimit results are returned.
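To illustrate the trade-off, here is a minimal sketch (reusing dirEnt and loadProps from the question; the page size of 500 is an arbitrary value chosen for illustration). Every matching group is still returned as you iterate, the smaller page size just means more round trips to the server:

using (var srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps))
{
    // Sketch only: 500 is an arbitrary page size chosen for illustration.
    srch.PageSize = 500;
    using (SearchResultCollection results = srch.FindAll())
    {
        int count = 0;
        foreach (SearchResult result in results)
        {
            count++;    // iterating pulls additional pages from the server as needed
        }
        Console.WriteLine(count);    // can exceed 1000, unlike the unpaged default
    }
}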
As Biri pointed out, it's important to dispose the SearchResultCollection returned by FindAll, otherwise you may have a memory leak, as described in the Remarks section of the MSDN documentation for DirectorySearcher.FindAll.
One way to help avoid this in .NET 2.0 or later is to write a wrapper method that automatically disposes the SearchResultCollection. This might look something like the following (or could be an extension method in .NET 3.5):
public IEnumerable<SearchResult> SafeFindAll(DirectorySearcher searcher)
{
    using (SearchResultCollection results = searcher.FindAll())
    {
        foreach (SearchResult result in results)
        {
            yield return result;
        }
    } // SearchResultCollection will be disposed here
}
You could then use this as follows:
using (var srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps))
{
    srch.PageSize = 1000;
    var results = SafeFindAll(srch);
}
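Note that SafeFindAll enumerates lazily, so the results should be consumed while the searcher is still in scope. As a usage sketch, reusing the names above:

using (var srch = new DirectorySearcher(dirEnt, "(objectClass=Group)", loadProps))
{
    srch.PageSize = 1000;
    // Enumerate inside the using block: the underlying searcher must still be
    // alive while SafeFindAll's deferred iteration is running.
    foreach (SearchResult result in SafeFindAll(srch))
    {
        // "name" is one of the properties requested via loadProps
        Console.WriteLine(result.Properties["name"][0]);
    }
}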