
php/symfony/doctrine memory leak?


Problem description

I'm having problems with a batch insertion of objects into a database using symfony 1.4 and doctrine 1.2.

My model has a certain kind of object called "Sector", each of which has several objects of type "Cupo" (usually ranging from 50 up to 200000). These objects are pretty small; just a short identifier string and one or two integers. Whenever a group of Sectors are created by the user, I need to automatically add all these instances of "Cupo" to the database. In case anything goes wrong, I'm using a doctrine transaction to roll back everything. The problem is that I can only create around 2000 instances before php runs out of memory. It currently has a 128MB limit, which should be more than enough for handling objects that use less than 100 bytes. I've tried increasing the memory limit up to 512MB, but php still crashes and that doesn't solve the problem. Am I doing the batch insertion correctly or is there a better way?

The error is:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 71 bytes) in /Users/yo/Sites/grifoo/lib/vendor/symfony/lib/log/sfVarLogger.class.php on line 170

Here is the code:

public function save($conn=null){

    $conn=$conn?$conn:Doctrine_Manager::connection();

    $conn->beginTransaction();


    try {
        $evento=$this->object;


        foreach($evento->getSectores() as $s){

            for($j=0;$j<$s->getCapacity();$j++){

                $cupo=new Cupo();
                $cupo->setActivo($s->getActivo());
                $cupo->setEventoId($s->getEventoId());
                $cupo->setNombre($j);
                $cupo->setSector($s);

                $cupo->save();

            }
        }

        $conn->commit();
        return;
    }
    catch (Exception $e) {
        $conn->rollback();
        throw $e;
    }
}

Once again, this code works fine for less than 1000 objects, but anything bigger than 1500 fails. Thanks for the help.

Recommended answer

I tried

$cupo->save();
$cupo->free();
$cupo = null;

(But substituting my code) And I'm still getting memory overflows. Any other ideas, SO?
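
For reference, a minimal sketch of how those three calls would sit inside the inner loop of the save() method shown above (the exact placement is an assumption; Cupo and its setters are taken from the question's code):

for ($j = 0; $j < $s->getCapacity(); $j++) {
    $cupo = new Cupo();
    $cupo->setActivo($s->getActivo());
    $cupo->setEventoId($s->getEventoId());
    $cupo->setNombre($j);
    $cupo->setSector($s);
    $cupo->save();

    // Release the record's internal references and drop our own reference,
    // so PHP can reclaim the memory before the next iteration.
    $cupo->free();
    $cupo = null;
}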

Update:

I created a new environment in my databases.yml, which looks like:

all:
  doctrine:
    class: sfDoctrineDatabase
    param:
      dsn: 'mysql:host=localhost;dbname=.......'
      username: .....
      password: .....
      profiler: false

The profiler: false entry disables doctrine's query logging, which normally keeps a copy of every query you make. It didn't stop the memory leakage, but I was able to get about twice as far through my data import as I was without it.
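
Note that the fatal error in the question is raised inside symfony's sfVarLogger, which buffers every log message in memory for the web debug toolbar, so symfony's own logging is another place where data piles up during a long import. As a related tweak, not part of the original answer and offered here only as an assumption that it helps in this scenario, the logger can be switched off for the environment that runs the import in config/factories.yml, the same way symfony 1.4's default configuration does for prod:

prod:
  logger:
    class: sfNoLogger
    param:
      level: err
      loggers: ~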

Update 2

I added

Doctrine_Manager::connection()->setAttribute(Doctrine_Core::ATTR_AUTO_FREE_QUERY_OBJECTS, true ); 

before running my queries, and changed

$cupo = null;

to

unset($cupo);

And now my script has been churning away happily. I'm pretty sure it will finish without running out of RAM this time.

Update 3

Yep. This is the winning combination.
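
Putting the updates together, a rough sketch of what the final import loop could look like, assuming profiler: false is already set in databases.yml as shown above (this is a reconstruction for illustration, not code from the original post):

// Let Doctrine free query objects automatically after they have executed.
Doctrine_Manager::connection()
    ->setAttribute(Doctrine_Core::ATTR_AUTO_FREE_QUERY_OBJECTS, true);

foreach ($evento->getSectores() as $s) {
    for ($j = 0; $j < $s->getCapacity(); $j++) {
        $cupo = new Cupo();
        $cupo->setActivo($s->getActivo());
        $cupo->setEventoId($s->getEventoId());
        $cupo->setNombre($j);
        $cupo->setSector($s);
        $cupo->save();

        // Free the record and unset the variable instead of just assigning null.
        $cupo->free();
        unset($cupo);
    }
}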
