Segmentation fault

Larry: CentOS and Fedora run a libc with more debugging. You’ll need the MALLOC_CHECK workaround until this is fixed in PHP.

I’ve tried CentOS with PHP 5.1.6 and 5.2.9, OS X with PHP 5.2.9, Win32 with PHP 5.2.9, and now Ubuntu with PHP 5.2.9. It segfaults with all of them. I suspect there is bad data in the database and Piwik is not handling it properly.

If you haven’t pruned your tables, you might consider backing up (or copying to another instance) your Piwik database, dropping the archive tables, and then using archive.sh to re-create your archives.

Dropped all archive tables and re-ran archive.sh, and the segfault still happens. I narrowed it down to the Actions plugin: it occurs when archiveDay calls updateActionsTableWithRowQuery. Disabling the Actions plugin prevents the segfault.

Clearing the piwik_log_action.name column prevents the segfault as well.

Hi,

I’ve done a bit more work on trying to isolate this bug, using the same data set as frabcus. I’ve had some success tracking it down further, and I have a fix, but not a definitive answer as to what the issue is.

Basically, the Actions plugin in Piwik builds nested arrays to represent the different ‘actions’ (URL segments) in the logs of site visits. These then get turned into DataTables, with various sorting and limiting filters applied to them.
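To make the nesting concrete, here is an illustrative sketch (not Piwik’s actual code; the function and key names are invented) of how a slash-separated URL path turns into one of those nested arrays, one level per segment:

```php
<?php
// Illustrative only: split a URL path into segments and nest an array
// one level deep per segment, with the stats payload at the leaf.
function pathToNestedArray($path, $leaf)
{
    $segments = array_filter(explode('/', trim($path, '/')), 'strlen');
    $node = $leaf;
    // Build from the innermost segment outwards.
    foreach (array_reverse($segments) as $segment) {
        $node = array($segment => $node);
    }
    return $node;
}

$tree = pathToNestedArray('/a/b/c', array('hits' => 1));
echo $tree['a']['b']['c']['hits'], "\n"; // prints 1
```

A URL with more than ten segments produces an array more than ten levels deep, which is the shape involved in the crash described below.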

The situation that I think causes the segfault is a URL with more than 10 slash-separated path segments, on a day when there are also more than 100 rows to record and archive. The URL causes a 10-deep array to be built (which is normally fine), but the fact that there are more than 100 records then causes filters to be applied that sort the rows in the resulting DataTable, remove some, and add a summary row. This process somehow corrupts the DataTable such that any subsequent attempt to access its rows causes a segfault (or, on OS X, a bus error). It’s fairly icky, as the filter classes are generated on the fly, and I think the problem may be something to do with several of these on-the-fly-created classes holding a reference to this deeply nested array with objects in it. For now, the problem seems solved by unsetting a reference to it in the final filter after use.
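The reference-holding theory fits how the PHP versions in this thread manage memory: before PHP 5.3 the engine had no cycle collector, so any structure that references itself, even indirectly through a filter object, is never freed until the request ends. A standalone sketch of that behaviour (nothing here is Piwik code):

```php
<?php
// Standalone sketch: a circular reference keeps the structure alive,
// because unset() only drops the local variable's refcount.
class Holder
{
    public $data;
}

$before = memory_get_usage();
for ($i = 0; $i < 1000; $i++) {
    $h = new Holder();
    $h->data = array('self' => $h); // cycle: the array points back at $h
    unset($h);                      // on PHP < 5.3 this memory is never reclaimed
}
$after = memory_get_usage();

// Later PHP versions do collect such cycles, but only when the garbage
// collector decides to run; until then the memory stays allocated.
echo ($after > $before ? 'memory grew' : 'memory stable'), "\n";
```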


--- core/DataTable/Filter/AddSummaryRow.php     (revision 1152)
+++ core/DataTable/Filter/AddSummaryRow.php     (working copy)
@@ -68,5 +68,6 @@
 		$newRow->setColumns(array('label' => $this->labelSummaryRow) + $newRow->getColumns());
 		$this->table->filter('Limit', array(0, $this->startRowToSummarize));
 		$this->table->addSummaryRow($newRow);
+		unset($rows);
 	}
 }

I’ve also got a test that should reproduce the problem a bit more simply:


	/**
	 */
	function test_serializeWithDeepNesting()
	{
		$row = new Piwik_DataTable_Row(array(0 => array('sort_key' => 1)));

		$test_array = array(
			'a' => $row,
			'b' => array('c' =>
					array('d' =>
						array('e' =>
							array('f' =>
								array('g' =>
									array('h' =>
										array('i' =>
											array('j' => $row)
										)
									)
								)
							)
						)
					)
				),
		);

		$dataTable = Piwik_ArchiveProcessing_Day::generateDataTable($test_array);
		$dataTable->sort('sort', 'sort_key');
		$rows = $dataTable->getRows();
		$dataTable->deleteRowsOffset(0);
	}


Thanks for the detective work!

Is this safe all the time? I see that in DataTable.php, getRows() sometimes returns $this->rows directly, while at other times it returns a new array.

I think it should be safe, as only the locally held reference gets unset. The example below shows that the array is still intact in its originating class:


class TestClass
{
	protected $rows = array('a', 'b', 'c');

	public function getRows()
	{
		return $this->rows;
	}
}

class OtherTestClass
{
	protected $table;

	public function __construct($table)
	{
		$this->table = $table;
		$this->filter();
	}

	protected function filter()
	{
		$rows = $this->table->getRows();
		unset($rows);
	}
}


$test_class = new TestClass();
$other_test_class = new OtherTestClass($test_class);
$rows = $test_class->getRows();
print_r($rows);

crowbot,

Thank you a zillion times, your unset($rows); worked like a charm.

Still needed to set memory_limit to 256 MB (128 MB wasn’t enough), but now Piwik 0.40 (a clean install that stopped working after a week on our website) works again.
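For reference, the memory limit can be raised either in php.ini or at runtime at the top of the archiving script; a minimal sketch (256M is simply the value that worked here, not a recommended setting):

```php
<?php
// Raise the per-request memory limit before heavy archiving work.
// 256M is the value reported to work in this thread; tune as needed.
ini_set('memory_limit', '256M');
echo ini_get('memory_limit'), "\n"; // prints 256M
```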

So I can confirm that your fix removes the segmentation fault.
I didn’t see any negative consequences.

Best Regards,
Beat

FYI: this isn’t in the 0.4.1 release.

I’ve created a ticket in trac: http://dev.piwik.org/trac/ticket/824

FYI: this is still being investigated, as I haven’t been able to reproduce the segfault or memory leak. (In fact, when I add the unset(), memory_get_usage() reports higher usage than without it.) I also tried increasing the max recursion level to the maximum (15) in test_serializeWithDeepNesting().
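For anyone else trying to reproduce this, a helper like the following can generate the deep nesting programmatically instead of writing it out by hand (buildNested and its key names are hypothetical, invented for illustration):

```php
<?php
// Hypothetical reproduction helper: builds an array nested $depth levels
// deep, mimicking a URL with many slash-separated path segments.
function buildNested($depth, $leaf)
{
    $node = $leaf;
    for ($i = 0; $i < $depth; $i++) {
        $node = array('level' . ($depth - $i) => $node);
    }
    return $node;
}

$nested = buildNested(15, array('sort_key' => 1));

// Walk back down to confirm how deep the structure goes.
$measured = 0;
for ($cur = $nested; is_array($cur); $cur = reset($cur)) {
    $measured++;
}
echo $measured, "\n"; // 15 wrapper levels plus the leaf array: prints 16
```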

Hello Joe,

I had the exact same issue but I updated my PHP version and that solved the issue for me.

Hope this helps

-Svet

The patch with unset($rows) fixed it for me!

I committed this “fix” to svn but was not able to reproduce the crash.

Out of curiosity. Is your php built with or without the Suhosin patch?

[quote=vipsoft @ Jul 22 2009, 03:57 AM]I committed this “fix” to svn but was not able to reproduce the crash.

Out of curiosity. Is your php built with or without the Suhosin patch?[/quote]

I don’t believe it is patched with Suhosin. I think it was installed from the CentOS PHP RPM package. The segfault was also happening on some random Ubuntu version of PHP as well.

A Piwik segmentation fault FAQ with a list of solutions is now published. Check it out: Troubleshooting - Analytics Platform - Matomo

Switching to archive.php fixed this for me, after upgrading PHP, turning off PHP caching modules, and raising the memory limit didn’t.