I'm struggling with automated data collection by a PHP script from a webserver. The files in question contain meteorological data and are updated every 10 minutes. Strangely enough, the 'file modified' date on the webserver doesn't change.
A simple fopen('http://...') call tries to fetch the freshest version of the last file in this directory every hour. But I regularly end up with a version that is up to 4 hours old. This happens on a Linux server which (as my system administrator has assured me) doesn't use a proxy server of any kind.
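For reference, the fetch looks roughly like this (the URL and the local output path are placeholders, not taken from the actual setup):

```php
<?php
// Rough sketch of the hourly fetch described above.
// The URL and local path are placeholders, not from the original setup.
$url = 'http://example.com/meteo/latest.dat';

$handle = fopen($url, 'r');
if ($handle === false) {
    die("Could not open $url\n");
}
$data = stream_get_contents($handle);
fclose($handle);

file_put_contents('/var/data/meteo/latest.dat', $data);
```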
Does PHP implement its own caching mechanism? Or what else could be interfering here?
(My current workaround is to grab the file via exec('wget --no-cache ...'), which works.)
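Written out, that workaround looks something like the sketch below; the URL, destination path and the extra -q/-O flags are assumptions added for illustration:

```php
<?php
// Sketch of the wget workaround; URL, destination path and the -q/-O flags
// are illustrative assumptions.
$url  = 'http://example.com/meteo/latest.dat';
$dest = '/var/data/meteo/latest.dat';

// --no-cache asks intermediate caches to revalidate instead of serving a stale copy.
exec('wget --no-cache -q -O ' . escapeshellarg($dest) . ' ' . escapeshellarg($url),
     $output, $status);

if ($status !== 0) {
    die("wget failed with exit code $status\n");
}
```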
Since you're fetching the file over HTTP, I'm assuming that PHP will honour any cache headers the server responds with.
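One way to check that assumption is to fetch the file once and dump the response headers, so any Cache-Control, Expires or Age headers coming from the server or an intermediary become visible. A minimal sketch (the URL is a placeholder):

```php
<?php
// Fetch the file and print the response headers to see what caching
// information the server (or something in between) is sending back.
$data = file_get_contents('http://example.com/meteo/latest.dat');

// $http_response_header is populated by PHP's http:// stream wrapper
// after the call above.
foreach ($http_response_header as $header) {
    echo $header, "\n";
}
```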
A very simple and dirty way to avoid that is to append a random GET parameter to each request.
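For example (the parameter name and URL are illustrative only):

```php
<?php
// Cache-busting trick: a unique query parameter makes every request look
// like a new URL to any cache along the way.
$url  = 'http://example.com/meteo/latest.dat?nocache=' . time();
$data = file_get_contents($url);
```

Since the query string changes on every run, a cache that keys on the full URL will never have a stored copy to serve.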