php - Will file() affect performance for files approximately 2 MB in size?
I have no alternative but to grab the data from a text file; I cannot use a database to store it. The file the function grabs the data from is re-created every day at 00:00, so it never grows too big: a maximum of 2 MB in size and a maximum of 6,000 - 7,000 lines by the end of the day. My concern: it grabs the data and displays it on a webpage that can be accessed a lot of times (approximately 10,000 per day or less) -- could using file() somehow overload the server, or should a small file like this be fine? Please let me know. Thanks for taking the time to read my question and possibly answer it.
Example lines from the .txt file:
1,42,16, 201,stackoverflow_user, 1, 6762160, 39799, 9817242, 6762160, 39884, 10010545,stackoverflow_user, 2, 1351147, 1165, 483259, 1351147, 1115, 241630, 0
1,46,27, 201,[stackoverflow_user | stackoverflow_userother], 1, 4078465, 286991, 1594830, 4078465, 287036, 1643156,stackoverflow_user, 2, 1357147, 1115, 241630, 1357147, 1065, 120815, 0
My function:
# Read the file into an array.
$lines = file('c:/path/to/file.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

# Flip the array so the last lines of the file come first.
$lines = array_reverse($lines);

$n = 1;
$wanted = 21; # or however many lines you want
$content = '';

foreach ($lines as $l) {
    # Treat the line as comma-separated values.
    $arr = explode(",", $l);

    # If column 5 has multiple values, take the first one.
    if (preg_match("/\[(.+?) \|/", $arr[4], $matches)) {
        $arr[4] = $matches[1];
    }

    # Is arr[4] the same as arr[12]?
    if ($arr[4] !== $arr[12]) {
        # These two are not equal, so use these values.
        $data = array('rank-pos' => $n++, 'rank-name' => $arr[4], 'rank-dmuser' => $arr[12]);
        $content .= template::load('rankinguserdm-' . ($n % 2 == 1 ? 2 : 1), $data);
    }

    # Have we got enough data?
    if ($n === $wanted) {
        break;
    }
}

$this->content = template::load('user_rankingsdm', array('rankings' => $content));
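Side note on the loop above: since it only ever uses the last ~21 matching lines, you don't strictly need file() to load all 6,000 - 7,000 lines into memory. A minimal sketch of reading just the tail of the file (the tail_lines() helper and its average-line-size estimate are my own assumptions, not part of the original code):

<?php
# Hypothetical helper: read roughly the last $approxLines lines of a
# file without loading the whole thing into memory.
function tail_lines(string $path, int $approxLines, int $avgLineBytes = 512): array
{
    $size = filesize($path);
    if ($size === false) {
        return [];
    }

    # Over-estimate how many bytes we need, capped at the file size.
    $bytes = min($size, $approxLines * $avgLineBytes);
    if ($bytes <= 0) {
        return [];
    }

    $fh = fopen($path, 'rb');
    fseek($fh, -$bytes, SEEK_END);
    $chunk = fread($fh, $bytes);
    fclose($fh);

    $lines = explode("\n", $chunk);
    if ($bytes < $size) {
        array_shift($lines); # drop the (probably partial) first line
    }
    return array_values(array_filter($lines, 'strlen')); # skip empty lines
}

# Usage: newest lines first, like array_reverse(file(...)).
$lines = array_reverse(tail_lines('c:/path/to/file.txt', 50));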
It depends on the context. If you don't expect that big an amount of traffic, you should be OK. Otherwise, find a way to store the data somewhere else: a database, RAM, nuclear shelters - whatever does the job, just don't read it from the file system every time.
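For the "RAM" option, one possibility in PHP is an APCu cache keyed on the file's modification time. A minimal sketch, assuming the APCu extension is installed (the key names and the filemtime() check are my own choices, not from the answer):

<?php
# Return the file's lines, reading from disk only when the file changes.
function get_ranking_lines(string $path): array
{
    $key      = 'ranking_lines';  # assumed cache key
    $mtimeKey = 'ranking_mtime';

    $mtime = filemtime($path);

    # Serve from RAM unless the file has changed since we cached it.
    if (apcu_fetch($mtimeKey) === $mtime) {
        $cached = apcu_fetch($key);
        if ($cached !== false) {
            return $cached;
        }
    }

    # Cache miss: read the file once and remember the result.
    $lines = file($path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    apcu_store($key, $lines);
    apcu_store($mtimeKey, $mtime);
    return $lines;
}

With ~10,000 hits per day, this turns thousands of disk reads into at most one read per file change.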
I once had a similar issue where I had to read text logfiles (each about 10 MB) from a remote server a few times per second. I figured that since there weren't that many users, I could take the easy shortcut and fetch them each time straight from the remote server. Long story short, the easy shortcut backfired when it turned out my server was DoS-ing the remote one.
I ended up storing the data in a MySQL database, while keeping track of each file's size and reading from where the previous read cycle ended (in order to avoid duplicates). Each file was requested from the remote server no more than once per minute. Along with a few other tricks, the remote server kept happily doing its job, and my server had data fresh enough for its needs.
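The "read from where the previous cycle ended" trick can be sketched like this (the file-based offset store and function names are illustrative assumptions for brevity; the answer kept its bookkeeping in MySQL):

<?php
# Read only the bytes appended to $logPath since the last call,
# remembering the read position in a small offset file.
function read_new_entries(string $logPath, string $offsetPath): array
{
    $offset = is_file($offsetPath) ? (int) file_get_contents($offsetPath) : 0;
    $size   = filesize($logPath);

    # If the file shrank, it was rotated/re-created: start over.
    if ($size < $offset) {
        $offset = 0;
    }

    $fh = fopen($logPath, 'rb');
    fseek($fh, $offset);
    $new = stream_get_contents($fh); # only the bytes added since last run
    $newOffset = ftell($fh);
    fclose($fh);

    file_put_contents($offsetPath, (string) $newOffset);
    return array_values(array_filter(explode("\n", $new), 'strlen'));
}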
TL;DR: decide depending on how much load you expect, the number of users and the hardware you've got. If you suspect that at some point you'll have huge loads, do yourself a favor and don't do it on the fly.
Cheers