Hi there
I'm currently using Cobalt just to display data in a table. I have made my own filters and it all works pretty well (I'm new to website building). My problem comes when I actually want to enter data, which I do via the CSV import function.
I'm importing into a section with 12 fields. Nothing major there. The data, however, will eventually become fairly large (maybe 100k to 500k articles), but I can't seem to import more than around 100 articles before the server times out. I get an error like "Fatal error: Maximum execution time of 30 seconds exceeded in .../libraries/joomla/language/language.php on line 376". I can't access php.ini, as it is a commercial server. Any chance I can fix this by changing some settings in Cobalt? Or is it a "bug" that needs addressing? It must be possible for Cobalt to split the data into pieces it can process and then continue...
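As a workaround while waiting for an answer, I've been thinking of splitting the CSV locally into small files that each fit under the 30-second limit, and importing them one by one. A minimal sketch of what I mean (the chunk size of 100 is just based on what my server managed so far):

```python
import csv


def split_csv(path, rows_per_chunk=100):
    """Split a large CSV into numbered part files, repeating the header
    row in each part, so every part imports within the time limit.
    Returns the number of parts written."""
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header for every chunk
        chunk, part = [], 0
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                part += 1
                _write_chunk(path, part, header, chunk)
                chunk = []
        if chunk:  # leftover rows that didn't fill a whole chunk
            part += 1
            _write_chunk(path, part, header, chunk)
    return part


def _write_chunk(path, part, header, rows):
    """Write one chunk to e.g. data.csv.part1.csv, header included."""
    with open(f"{path}.part{part}.csv", "w", newline="",
              encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)
```

That keeps each upload small, but with 100k+ rows that's still a lot of manual imports, which is why a built-in batched import in Cobalt would be much nicer.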
Another thing is the lack of a mass-delete option. I have my data in a local database and export it to CSV. I would like to be able to delete all records and then import the new ones, as some data needs to be unpublished, which I don't want to do manually. Or is it possible to have a field in the imported CSV that determines whether a record is published? Then I could just update the data instead of deleting and inserting anew.
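If a publish-state column in the import is supported, I could generate it during my local export, something like the sketch below. The "published" column name and the 1/0 values are just my guesses at what Cobalt might accept, not confirmed field names:

```python
import csv


def add_published_column(src_path, dst_path, unpublished_ids):
    """Copy an exported CSV, appending a hypothetical 'published' column
    (1 = published, 0 = unpublished). Assumes the first column holds a
    unique record id that can be matched against unpublished_ids."""
    with open(src_path, newline="", encoding="utf-8") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        header = next(reader)
        writer.writerow(header + ["published"])
        for row in reader:
            state = "0" if row[0] in unpublished_ids else "1"
            writer.writerow(row + [state])
```

Then a single import could both update records and flip the ones that need to go unpublished, with no manual clicking.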
Otherwise Cobalt seems great!
Thank you!
Ove