I'm using CodeIgniter to pull blog posts from my database, and I want to display how much time has elapsed since each post was published. By default, CodeIgniter's timespan() function displays, for example:
1 Year, 10 Months, 2 Weeks, 5 Days, 10 Hours, 16 Minutes
but I want to translate the words Year, Months, Weeks, Days, Hours, and Minutes into another language, for example Bosnian, Croatian, or Serbian.
How can I translate it?
I found the answer. You need to go to the application/language/english folder and create a file called date_lang.php. In my case I only want one language on my website, so I made the file in the default (english) folder. If you have more languages, create the folder application/language/YOURLANGUAGE, put date_lang.php in it, and change $config['language'] = 'english' in application/config/config.php to the name of the language folder you made.
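For example, if your language folder were named bosnian (a hypothetical name; use whatever folder you actually created), the config change would be:
$config['language'] = 'bosnian';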
I added this piece of code to my date_lang.php file and it works well.
<?php
$lang['date_year'] = 'Godinu';
$lang['date_years'] = 'Godina';
$lang['date_month'] = 'Mjesec';
$lang['date_months'] = 'Mjeseci';
$lang['date_week'] = 'Sedmica';
$lang['date_weeks'] = 'Sedmice';
$lang['date_day'] = 'Dan';
$lang['date_days'] = 'Dana';
$lang['date_hour'] = 'Sat';
$lang['date_hours'] = 'Sati';
$lang['date_minute'] = 'Minute';
$lang['date_minutes'] = 'Minuta';
$lang['date_second'] = 'Sekundu';
$lang['date_seconds'] = 'Sekunde';
?>
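With those overrides in place, timespan() picks up the date language file automatically. A minimal usage sketch (assuming $post->created_at holds the post's Unix timestamp; adjust to your own column name):
// In a controller or view; the date helper provides timespan() and now()
$this->load->helper('date');
echo timespan($post->created_at, now()); // e.g. "1 Godinu, 10 Mjeseci, 2 Sedmice"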
I created an interactive dashboard using hvPlot's .interactive and Panel:
template = pn.template.FastListTemplate(
    title='Central Africa - Word Frequency Analysis',
    sidebar=['Frequency', yaxis],
    main=[ihvplot.panel(), ipanel.panel()],
    accent_base_color="#88d8b0",
    header_background="#88d8b0",
    sidebar_width=450,
    theme='dark',
)
template.show()
where ihvplot and ipanel are my interactive plot and table, respectively. However, instead of the table I would like to show an interactive word cloud, which I created using ipywidgets:
list1 = ['Corbeau News', 'Journal de Bangui', 'Le Potentiel', 'Ndjoni Sango',
         'RJDH', 'Radio Lengo Songo', 'Radio Ndeke Luka']

def makingclouds(Category, frame, col, atitle):
    wordcloud_bangui = WordCloud(stopwords=stopword, width=1600, height=800,
                                 background_color="black",
                                 colormap="Set2").generate(''.join(data_file_1['Content']))
    plt.figure(figsize=(20, 10), facecolor='k')
    plt.title(atitle, fontsize=40)  # fontweight="bold"
    plt.imshow(wordcloud_bangui, interpolation="bilinear")
    plt.axis("off")
    plt.tight_layout(pad=0)

wordcloud = interact(makingclouds, Category=list1,
                     frame=fixed(data_file_1[['Source', 'Content']]),
                     col=fixed('Content'),
                     atitle=fixed('Most used words media - Central Africa'))
My question is how I can do this. I tried simply putting wordcloud in place of ipanel.panel() in the main list of the first snippet, but it tells me that the object has no panel attribute.
What can I do?
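One approach that may work (a sketch, not a tested answer: it assumes data_file_1, stopword, list1, yaxis, and ihvplot from the snippets above, and it assumes the Source column holds the names in list1, which the original function never actually used) is to have the plotting function return a Matplotlib figure and let Panel re-render it through pn.bind:

import matplotlib.pyplot as plt
import panel as pn
from wordcloud import WordCloud

def wordcloud_fig(category):
    # Build and return a figure instead of relying on ipywidgets to display it
    texts = data_file_1.loc[data_file_1['Source'] == category, 'Content']
    wc = WordCloud(stopwords=stopword, width=1600, height=800,
                   background_color="black",
                   colormap="Set2").generate(' '.join(texts))
    fig, ax = plt.subplots(figsize=(20, 10), facecolor='k')
    ax.imshow(wc, interpolation="bilinear")
    ax.set_title('Most used words media - Central Africa', fontsize=40)
    ax.axis("off")
    plt.close(fig)  # avoid a duplicate static render in notebooks
    return fig

source_select = pn.widgets.Select(name='Source', options=list1)
cloud = pn.panel(pn.bind(wordcloud_fig, source_select))

template = pn.template.FastListTemplate(
    title='Central Africa - Word Frequency Analysis',
    sidebar=['Frequency', yaxis, source_select],
    main=[ihvplot.panel(), cloud],
    accent_base_color="#88d8b0",
    header_background="#88d8b0",
    sidebar_width=450,
    theme='dark',
)
template.show()

The underlying point is that Panel's main list wants renderable objects (or bound functions returning one), not the object returned by ipywidgets' interact, which is why you got the "no panel attribute" error.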
I'm using Google Ads API v11 to upload conversions and adjust conversions.
I send hundreds of conversions each day and want to start sending batch requests instead.
I've followed Google's documentation and I upload/adjust conversions exactly the way it describes:
https://developers.google.com/google-ads/api/docs/conversions/upload-clicks
https://developers.google.com/google-ads/api/docs/conversions/upload-adjustments
I could not find any good explanation or example of how to send batch requests:
https://developers.google.com/google-ads/api/reference/rpc/v11/BatchJobService
Below is my code, an example of how I adjust hundreds of conversions.
An explanation of how to do so with batch requests would be much appreciated.
# Adjust the conversion value of an existing conversion via the Google Ads API
def adjust_offline_conversion(
        client,
        customer_id,
        conversion_action_id,
        gclid,
        conversion_date_time,
        adjustment_date_time,
        restatement_value,
        adjustment_type='RESTATEMENT'):
    # Check that gclid is a valid string, else exit the function
    if type(gclid) is not str:
        return None
    # If conversion_date_time is a string, parse it into a datetime
    if type(conversion_date_time) is str:
        conversion_date_time = datetime.strptime(conversion_date_time, '%Y-%m-%d %H:%M:%S')
    # Move the conversion time one day forward to avoid this error (as explained
    # by Google: "The Offline Conversion cannot happen before the ad click. Add
    # 1-2 days to your conversion time in your upload, or check that the time
    # zone is properly set.")
    to_datetime_plus_one = conversion_date_time + timedelta(days=1)
    # If the shifted time is in the future, clamp it to now (enough to avoid the
    # original error while also avoiding a new one, since Google does not accept
    # dates later than now)
    to_datetime_plus_one = to_datetime_plus_one if to_datetime_plus_one < datetime.utcnow() else datetime.utcnow()
    # Convert the datetime back to a string and append a time zone suffix
    # (+00:00, i.e. UTC), which the Google Ads API requires
    adjusted_string_date = to_datetime_plus_one.strftime('%Y-%m-%d %H:%M:%S') + "+00:00"
    conversion_adjustment_type_enum = client.enums.ConversionAdjustmentTypeEnum
    # Determine the adjustment type.
    conversion_adjustment_type = conversion_adjustment_type_enum[adjustment_type].value
    # Associate the conversion adjustment with the existing conversion action.
    # The GCLID should have been uploaded before with a conversion.
    conversion_adjustment = client.get_type("ConversionAdjustment")
    conversion_action_service = client.get_service("ConversionActionService")
    conversion_adjustment.conversion_action = (
        conversion_action_service.conversion_action_path(
            customer_id, conversion_action_id
        )
    )
    conversion_adjustment.adjustment_type = conversion_adjustment_type
    conversion_adjustment.adjustment_date_time = adjustment_date_time.strftime('%Y-%m-%d %H:%M:%S') + "+00:00"
    # Set the gclid/date pair
    conversion_adjustment.gclid_date_time_pair.gclid = gclid
    conversion_adjustment.gclid_date_time_pair.conversion_date_time = adjusted_string_date
    # Set the adjusted value for adjustment type RESTATEMENT.
    if conversion_adjustment_type == conversion_adjustment_type_enum.RESTATEMENT.value:
        conversion_adjustment.restatement_value.adjusted_value = float(restatement_value)
    conversion_adjustment_upload_service = client.get_service("ConversionAdjustmentUploadService")
    request = client.get_type("UploadConversionAdjustmentsRequest")
    request.customer_id = customer_id
    request.conversion_adjustments = [conversion_adjustment]
    request.partial_failure = True
    response = (
        conversion_adjustment_upload_service.upload_conversion_adjustments(
            request=request,
        )
    )
    conversion_adjustment_result = response.results[0]
    print(
        f"Uploaded conversion that occurred at "
        f'"{conversion_adjustment_result.adjustment_date_time}" '
        f"from Gclid "
        f'"{conversion_adjustment_result.gclid_date_time_pair.gclid}"'
        f' to "{conversion_adjustment_result.conversion_action}"'
    )

# Iterate over every row (subscriber) and call the adjust function for it
df.apply(lambda row: adjust_offline_conversion(client=client,
                                               customer_id=customer_id,
                                               conversion_action_id='xxxxxxx',
                                               gclid=row['click_id'],
                                               conversion_date_time=row['subscription_time'],
                                               adjustment_date_time=datetime.utcnow(),
                                               restatement_value=row['revenue']),
         axis=1)
I managed to solve it in the following way:
Conversion uploads and adjustments are not supported by batch processing (BatchJobService), as they are not among the operations listed there.
However, it is possible to upload multiple conversions in one request, since the conversions[] field is a list that can be populated with several conversions, not only a single conversion as I mistakenly thought.
So if you're uploading or adjusting conversions, you can simply batch them this way:
Instead of uploading one conversion:
request.conversions = [conversion]
Upload several:
request.conversions = [conversion_1, conversion_2, conversion_3...]
Going the same way for conversions adjustment upload:
request.conversion_adjustments = [conversion_adjustment_1, conversion_adjustment_2, conversion_adjustment_3...]
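Put together, a sketch of the batched version of the adjustment upload from the question could look like this (build_adjustment is a hypothetical helper: it would contain the same field-setting code as adjust_offline_conversion above, but return the ConversionAdjustment instead of uploading it):

def upload_adjustments_in_batch(client, customer_id, adjustments):
    # One request carrying many ConversionAdjustment objects
    request = client.get_type("UploadConversionAdjustmentsRequest")
    request.customer_id = customer_id
    request.conversion_adjustments = adjustments
    request.partial_failure = True
    service = client.get_service("ConversionAdjustmentUploadService")
    response = service.upload_conversion_adjustments(request=request)
    # With partial_failure=True, failed rows are reported here instead of
    # failing the whole request
    if response.partial_failure_error:
        print(response.partial_failure_error.message)
    return response.results

# Build one adjustment per DataFrame row, then send them all together
adjustments = [
    build_adjustment(client, customer_id, 'xxxxxxx', row['click_id'],
                     row['subscription_time'], datetime.utcnow(), row['revenue'])
    for _, row in df.iterrows()
]
results = upload_adjustments_in_batch(client, customer_id, adjustments)

If you send very large batches, check the per-request limits in Google's quotas documentation and chunk the list accordingly.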
I am trying to download public comments and replies from a public Facebook post of a page.
My code was working until 5 Feb 2018; now it shows the error below for the replies:
Error in data.frame(from_id = json$from$id, from_name = json$from$name, :
arguments imply differing number of rows: 0, 1
Called from: data.frame(from_id = json$from$id, from_name = json$from$name,
message = ifelse(!is.null(json$message), json$message, NA),
created_time = json$created_time, likes_count = json$like_count,
comments_count = json$comment_count, id = json$id, stringsAsFactors = F)
Please see the code I am using below.
data_fun = function(II, JJ, page, my_oauth) {
  test <- list()
  test.reply <- list()
  for (i in II:length(page$id)) {
    test[[i]] <- getPost(post = page$id[i], token = my_oauth, n = 100000,
                         comments = TRUE, likes = FALSE)
    if (nrow(test[[i]][["comments"]]) > 0) {
      write.csv(test[[i]], file = paste0(page$from_name[2], "_comments_", i, ".csv"),
                row.names = F)
      for (j in JJ:length(test[[i]]$comments$id)) {
        test.reply[[j]] <- getCommentReplies(comment_id = test[[i]]$comments$id[j],
                                             token = my_oauth, n = 100000,
                                             replies = TRUE, likes = FALSE)
        if (nrow(test.reply[[j]][["replies"]]) > 0) {
          write.csv(test.reply[[j]],
                    file = paste0(page$from_name[2], "_replies_", i, "_and_", j, ".csv"),
                    row.names = F)
        }
      }
    }
  }
  Sys.sleep(10)
}
Thanks in advance for your support.
I had the very same problem, as Facebook changed the API rules at the end of January. If you update the package with devtools from Pablo Barbera's GitHub, it should work for you.
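For reference, the install would look something like this (assuming the package still lives in the Rfacebook subdirectory of the pablobarbera/Rfacebook repository):

# Install the development version of Rfacebook from GitHub
install.packages("devtools")
devtools::install_github("pablobarbera/Rfacebook", subdir = "Rfacebook")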
I have amended my code (a little) and it now works fine for replies to comments. One frustrating thing, though: Facebook doesn't appear to allow extracting the user name anymore. I already have a pool of data, so I am now using that to train and predict gender.
If you have any questions and want to make contact, drop me an email at robert.chestnutt2@mail.dcu.ie.
By the way, it may not be an issue for you, but I have had challenges in the past writing the Rfacebook output to a CSV. Saving the output as an .RData file preserves its structure much better.
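For example, using the objects from the question's code:

# Save the raw list structures instead of flattening them to CSV
save(test, test.reply, file = "facebook_output.RData")
# ...and restore them later with:
load("facebook_output.RData")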
I tried to use stdWrap.cache on different instances (TYPO3 7.6.23 and 8.7.8) according to https://docs.typo3.org/typo3cms/TyposcriptReference/7.6/Functions/Cache/, but the content is rendered on every page instead of being shared with the other pages.
The exact example from the documentation doesn't work either:
page = PAGE
page.5 = TEXT
page.5 {
stdWrap.cache.key = mycurrenttimestamp
stdWrap.cache.tags = tag_a,tag_b,tag_c
stdWrap.cache.lifetime = 3600
stdWrap.data = date : U
stdWrap.strftime = %H:%M:%S
}
Can anybody confirm this? Or does anybody have a working use case?
It turned out to be a documentation issue.
Caching has changed a little: you should use cache directly on your object, without stdWrap:
page = PAGE
page.5 = TEXT
page.5 {
cache.key = mycurrenttimestamp
cache.tags = tag_a,tag_b,tag_c
cache.lifetime = 3600
data = date : U
strftime = %H:%M:%S
}
See https://forge.typo3.org/issues/82828
I'm using lastLogonTimeStamp to track users' last logon times, as in the following code:
$Domain = [System.DirectoryServices.ActiveDirectory.Domain]::GetCurrentDomain()
$ADSearch = New-Object System.DirectoryServices.DirectorySearcher
$ADSearch.SearchRoot = "LDAP://$Domain"
$ADSearch.SearchScope = "subtree"
$ADSearch.PageSize = 100
$ADSearch.Filter = "(objectClass=user)"
$properties = @("distinguishedName",
                "sAMAccountName",
                "mail",
                "lastLogonTimeStamp")
foreach ($pro in $properties) {
    $ADSearch.PropertiesToLoad.Add($pro)
}
$userObjects = $ADSearch.FindAll()
foreach ($user in $userObjects) {
    $logon = $user.Properties.Item("lastLogonTimeStamp")[0]
    $lastLogon = [datetime]::FromFileTime($logon)
    $lastLogon = $lastLogon.ToString("yyyy/MM/dd")
    $lastLogon
}
This is what I've gotten so far:
1601/01/01
1601/01/01
3/12/2012
1601/01/01
3/19/2015
This is not the first time I've been thoroughly confused by the 1601/01/01 value. I've read the MS documentation about it, and it doesn't explain its purpose well; lastLogonTimeStamp is not the only attribute that returns it, many others do as well. So my questions are:
What is the purpose of this value?
In this case, what should I return as proper human-readable output? (Is this attribute simply not valid for this user?)
There is a known bug with lastLogonTimeStamp and Windows Server 2016 domain controllers:
LDAP simple binds do not update the timestamp the way previous OS versions (2012, 2008) did. Be careful.
I spent 2 months with MS on this. A patch will be released eventually... but for now it's not fixed.
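As for the 1601/01/01 output itself: 1601-01-01 is the zero point of the Windows FILETIME format, so [datetime]::FromFileTime(0) means the attribute has never been set, i.e. the account has never logged on (at least not since lastLogonTimeStamp started being replicated in the domain). A minimal guard for the loop in the question might look like this (a sketch; the "Never" label is an arbitrary placeholder):

foreach ($user in $userObjects) {
    $logon = $user.Properties.Item("lastLogonTimeStamp")[0]
    if (-not $logon) {
        # Attribute absent or zero: no logon ever recorded
        "Never"
    }
    else {
        [datetime]::FromFileTime($logon).ToString("yyyy/MM/dd")
    }
}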