I need to parse XML files of around 40GB in size, normalize the data, and insert it into a MySQL database. It is not clear how much of each file I need to store in the database, and I don't know the XML structure either.
Which parser should I use, and how would you go about doing this?
In PHP, you can read extremely large XML files with XMLReader (Docs):
$xmlfile = 'path/to/large.xml'; // path is an example placeholder
$reader = new XMLReader();
$reader->open($xmlfile);
Extremely large XML files should be stored in a compressed format on disk. At least this makes sense, as XML files have a high compression ratio, for example gzipped like large.xml.gz.
PHP supports that quite well with XMLReader via the compression wrappers (Docs):
$xmlfile = 'compress.zlib://path/to/large.xml.gz';
$reader = new XMLReader();
$reader->open($xmlfile);
XMLReader allows you to operate on the current element "only". That means it is forward-only. If you need to keep parser state, you need to build it yourself.
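For illustration, a minimal forward-only read loop might look like the following. This is a sketch only: the <item> element name and the file path are placeholders, since the actual XML structure is not known from the question.

// Minimal sketch of a forward-only pass, assuming repeated <item> elements.
$reader = new XMLReader();
$reader->open('compress.zlib://path/to/large.xml.gz');

while ($reader->read()) {
    // XMLReader only exposes the node the cursor is currently positioned on.
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'item') {
        // expand() copies just the current element into a DOMNode,
        // so it can be inspected without loading the whole document.
        $node = $reader->expand();
        // ... normalize $node and insert it into the database here ...
    }
}

$reader->close();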
I often find it helpful to wrap the basic movements into a set of iterators that know how to operate on XMLReader, like iterating through elements or child elements only. You can find this outlined in Parse XML with PHP and XMLReader.
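As a rough sketch of that idea (the generator function, element name, and path below are my own illustration, not taken from the linked article), the forward movement can be wrapped in a small iterator like this:

// Yields the reader each time it is positioned on an element with the given name.
function iterateElements(XMLReader $reader, string $name): Generator
{
    while ($reader->read()) {
        if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === $name) {
            yield $reader;
        }
    }
}

$reader = new XMLReader();
$reader->open('compress.zlib://path/to/large.xml.gz');

foreach (iterateElements($reader, 'item') as $r) {
    // readOuterXml() returns the current element, including children, as a string.
    echo $r->readOuterXml(), "\n";
}

$reader->close();

Keeping the movement logic in one place like this makes it easier to layer normalization and database inserts on top without scattering read() calls through the code.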