Choose Class in BIRT is empty even though I have added the JAR in the data source - birt

While creating the dataset, the Choose Class window is empty even though I have added the JAR. I am using Eclipse Luna Service Release 2 (4.4.2).

From: http://yaragalla.blogspot.com/2013/10/using-pojo-datasource-in-birt-43.html
In the dataset class, the three methods “public void open(Object obj, Map map)”, “public Object next()”, and “public void close()” must be implemented.
Make sure you have implemented these.
Here is a sample that I tested with:
import java.text.ParseException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Map;

public class UserDataSet {
    public Iterator<User> itr;

    public List<User> getUsers() throws ParseException {
        List<User> users = new ArrayList<>();
        // Add to Users
        // ....
        return users;
    }

    public void open(Object obj, Map<String, Object> map) {
        try {
            itr = getUsers().iterator();
        } catch (ParseException e) {
            e.printStackTrace();
        }
    }

    public Object next() {
        if (itr.hasNext())
            return itr.next();
        return null;
    }

    public void close() {
    }
}
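For completeness, here is a minimal User POJO the sample above could iterate over (the fields are hypothetical and not from the original post; any bean with getters works):

// Hypothetical POJO; the fields are illustrative only.
public class User {
    private String name;
    private int age;

    public User(String name, int age) {
        this.name = name;
        this.age = age;
    }

    public String getName() {
        return name;
    }

    public int getAge() {
        return age;
    }
}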

Related

Trino UDF: how to create an aggregate function for a window function

I tried to write a UDF to calculate over my data. From Trino's docs I learned that I should write a function plugin, and I succeeded in executing my UDF aggregate function in SQL.
But when I write SQL that uses the aggregate function together with a window function, the SQL fails.
The error log is com.google.common.util.concurrent.ExecutionError: java.lang.NoClassDefFoundError: com/example/ListState.
I think I may need to implement the interface for the window function.
The ListState.java file code:
@AccumulatorStateMetadata(stateSerializerClass = ListStateSerializer.class, stateFactoryClass = ListStateFactory.class)
public interface ListState extends AccumulatorState {
    List<String> getList();

    void setList(List<String> value);
}
The ListStateSerializer file code:
public class ListStateSerializer implements AccumulatorStateSerializer<ListState>
{
    @Override
    public Type getSerializedType() {
        return VARCHAR;
    }

    @Override
    public void serialize(ListState state, BlockBuilder out) {
        if (state.getList() == null) {
            out.appendNull();
            return;
        }
        String value = String.join(",", state.getList());
        VARCHAR.writeSlice(out, Slices.utf8Slice(value));
    }

    @Override
    public void deserialize(Block block, int index, ListState state) {
        String value = VARCHAR.getSlice(block, index).toStringUtf8();
        List<String> list = Arrays.asList(value.split(","));
        state.setList(list);
    }
}
The ListStateFactory file code:
public class ListStateFactory implements AccumulatorStateFactory<ListState> {

    public static final class SingleListState implements ListState {
        private List<String> list = new ArrayList<>();

        @Override
        public List<String> getList() {
            return list;
        }

        @Override
        public void setList(List<String> value) {
            list = value;
        }

        @Override
        public long getEstimatedSize() {
            if (list == null) {
                return 0;
            }
            return list.size();
        }
    }

    public static class GroupedListState implements GroupedAccumulatorState, ListState {
        private final ObjectBigArray<List<String>> container = new ObjectBigArray<>();
        private long groupId;

        @Override
        public List<String> getList() {
            return container.get(groupId);
        }

        @Override
        public void setList(List<String> value) {
            container.set(groupId, value);
        }

        @Override
        public void setGroupId(long groupId) {
            this.groupId = groupId;
            if (this.getList() == null) {
                this.setList(new ArrayList<String>());
            }
        }

        @Override
        public void ensureCapacity(long size) {
            container.ensureCapacity(size);
        }

        @Override
        public long getEstimatedSize() {
            return container.sizeOf();
        }
    }

    @Override
    public ListState createSingleState() {
        return new SingleListState();
    }

    @Override
    public ListState createGroupedState() {
        return new GroupedListState();
    }
}
Thanks for the help!
I also found the WindowAccumulator class in the Trino source code, but I don't know how to use it.
How do I create an aggregate function that works as a window function?
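For context, the post doesn't show the aggregation function itself. A Trino aggregation that consumes a state like ListState is typically declared with the annotation API; the sketch below is an assumption (the function name my_list_agg and the VARCHAR-only signature are illustrative, not from the post):

import io.airlift.slice.Slice;
import io.airlift.slice.Slices;
import io.trino.spi.block.BlockBuilder;
import io.trino.spi.function.AggregationFunction;
import io.trino.spi.function.AggregationState;
import io.trino.spi.function.CombineFunction;
import io.trino.spi.function.InputFunction;
import io.trino.spi.function.OutputFunction;
import io.trino.spi.function.SqlType;
import io.trino.spi.type.StandardTypes;
import static io.trino.spi.type.VarcharType.VARCHAR;

@AggregationFunction("my_list_agg")
public final class MyListAggFunction {
    private MyListAggFunction() {}

    @InputFunction
    public static void input(@AggregationState ListState state,
                             @SqlType(StandardTypes.VARCHAR) Slice value) {
        // Accumulate one row's value into the state.
        state.getList().add(value.toStringUtf8());
    }

    @CombineFunction
    public static void combine(@AggregationState ListState state,
                               @AggregationState ListState other) {
        // Merge partial states computed on different splits.
        state.getList().addAll(other.getList());
    }

    @OutputFunction(StandardTypes.VARCHAR)
    public static void output(@AggregationState ListState state, BlockBuilder out) {
        // Emit the final comma-joined result.
        VARCHAR.writeSlice(out, Slices.utf8Slice(String.join(",", state.getList())));
    }
}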

Repository returning null when autowiring in Kinesis KCL Consumer

I'm currently working with AWS Kinesis using the KCL library. I can consume the records and print them with println, but when I try to use a repository class, it is null (I'm autowiring it).
RecordProcessor
public class ScoreRecordProcessor implements ShardRecordProcessor {

    private String shardId;

    @Autowired
    private ConstatacoesRepository repo;

    @Override
    public void initialize(InitializationInput initializationInput) {
        shardId = initializationInput.shardId();
        System.out.println(String.format("Inicializando leitura na shard %s @ sequence: %s", shardId,
                initializationInput.extendedSequenceNumber().toString()));
    }

    @Override
    public void processRecords(ProcessRecordsInput processRecordsInput) {
        ObjectMapper mapper = new ObjectMapper();
        for (KinesisClientRecord record : processRecordsInput.records()) {
            byte[] byteArr = new byte[record.data().remaining()];
            record.data().get(byteArr);
            System.out.println("Constatacao recebida -> " + new String(byteArr));
            try {
                ResponseScoreDTO score = mapper.readValue(new String(byteArr), ResponseScoreDTO.class);
                for (Constatacao constatacao : score.getConstatacao()) {
                    Constatacoes entidadeBanco = new Constatacoes();
                    entidadeBanco.setArea(constatacao.getArea());
                    entidadeBanco.setConstatacaoNotaFiscal(constatacao.getConstatacao());
                    entidadeBanco.setCriticidade(constatacao.getCriticidade());
                    entidadeBanco.setEfetivaEscrituracao(constatacao.getEfetivaEscReg());
                    entidadeBanco.setEscopo(constatacao.getEscopo());
                    entidadeBanco.setIdSolicitacaoNotaFiscal(BigInteger.valueOf(Long.valueOf(score.getIdTransacao())));
                    entidadeBanco.setTxtConstatacao(null);
                    repo.save(entidadeBanco);
                    System.out.println("Entidade salva com sucesso.");
                }
            } catch (JsonProcessingException e) {
                e.printStackTrace();
            }
        }
    }

    @Override
    public void leaseLost(LeaseLostInput leaseLostInput) {
    }

    @Override
    public void shardEnded(ShardEndedInput shardEndedInput) {
        System.out.println(String.format("Shard %s chegou ao fim.", shardId));
    }

    @Override
    public void shutdownRequested(ShutdownRequestedInput shutdownRequestedInput) {
    }
}
RecordProcessorFactory
public class ScoreRecordProcessorFactory implements ShardRecordProcessorFactory {
    @Override
    public ShardRecordProcessor shardRecordProcessor() {
        return new ScoreRecordProcessor();
    }
}
Repository
@Repository
public interface ConstatacoesRepository extends JpaRepository<Constatacoes, BigInteger> {
}
Screenshot of the console output (image not included).
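A likely cause, though the thread doesn't confirm it: the factory creates ScoreRecordProcessor with new, so Spring never gets a chance to inject the @Autowired repository. One sketch of a workaround is to pass the Spring-managed repository through the factory; the constructor shown is an assumption, not from the post:

// Sketch: constructor injection instead of field @Autowired.
// ScoreRecordProcessor would take the repository as a constructor argument.
public class ScoreRecordProcessorFactory implements ShardRecordProcessorFactory {

    private final ConstatacoesRepository repo;

    public ScoreRecordProcessorFactory(ConstatacoesRepository repo) {
        this.repo = repo;
    }

    @Override
    public ShardRecordProcessor shardRecordProcessor() {
        // The processor receives the already-wired repository.
        return new ScoreRecordProcessor(repo);
    }
}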

Spring MVC with Thread

Hi, my thread class is throwing a NullPointerException. Please help me resolve it.
@Component
public class AlertsToProfile extends Thread {

    public final Map<Integer, List<String>> userMessages =
            Collections.synchronizedMap(new HashMap<Integer, List<String>>());

    @Autowired
    ProfileDAO profileDAO;

    private String categoryType;
    private String dataMessage;

    public String getCategoryType() {
        return categoryType;
    }

    public void setCategoryType(String categoryType) {
        this.categoryType = categoryType;
    }

    public String getDataMessage() {
        return dataMessage;
    }

    public void setDataMessage(String dataMessage) {
        this.dataMessage = dataMessage;
    }

    @Override
    public void run() {
        String category = getCategoryType();
        String data = getDataMessage();
        List<Profile> all = profileDAO.findAll();
        if (all != null) {
            // Compare strings with equalsIgnoreCase, not ==
            if ("All".equalsIgnoreCase(category)) {
                for (Profile profile : all) {
                    List<String> list = userMessages.get(profile.getId());
                    if (list == null) {
                        ArrayList<String> strings = new ArrayList<String>();
                        strings.add(data);
                        userMessages.put(profile.getId(), strings);
                    } else {
                        list.add(data);
                    }
                }
            }
        }
    }
}
And my service method is as follows:
@Service
public class NoteManager
{
    @Autowired AlertsToProfile alertsToProfile;

    public void addNote(String type, String message, String category) {
        alertsToProfile.setCategoryType(category);
        String data = type + "," + message;
        alertsToProfile.setDataMessage(data);
        alertsToProfile.start();
        System.out.println("addNotes is done");
    }
}
But when I call the start() method I get a NullPointerException. Please help me; I am new to using threads with Spring.
It's pretty obvious: you instantiate your thread directly, as opposed to letting Spring create AlertsToProfile and autowire your instance.
To fix this, create a Runnable around your run() method and embed that into a method, something like this:
public void startThread() {
    new Thread(new Runnable() {
        @Override
        public void run() {
            // your code in here
        }
    }).start();
}
You will want to bind the Thread instance to a field in AlertsToProfile in order to avoid leaks and to stop the thread when you're done.
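As an aside, a more Spring-idiomatic sketch keeps the bean container-managed and hands the work to a TaskExecutor (this assumes a TaskExecutor bean is configured; the publish method name is illustrative):

@Component
public class AlertsToProfile {

    @Autowired
    private ProfileDAO profileDAO;

    @Autowired
    private TaskExecutor taskExecutor; // assumes a TaskExecutor bean exists in the context

    public void publish(String categoryType, String dataMessage) {
        // profileDAO is injected because Spring creates this bean,
        // so the background task can use it safely.
        taskExecutor.execute(() -> {
            List<Profile> all = profileDAO.findAll();
            // ... same fan-out logic as in run() above ...
        });
    }
}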

Custom WritableComparable displays object reference as output

I am new to Hadoop and Java, and I feel there is something obvious I am just missing. I am using Hadoop 1.0.3, if that matters.
My goal for using Hadoop is to take a bunch of files and parse them one file at a time (as opposed to line by line). Each file will produce multiple key-value pairs, but the context of the other lines is important. The key and value are multi-value/composite, so I have implemented WritableComparable for the key and Writable for the value. Because the processing of each file takes a bit of CPU, I want to save the output of the mapper, then run multiple reducers later on.
For the composite keys, I followed http://stackoverflow.com/questions/12427090/hadoop-composite-key
The problem is, the output is just Java object references as opposed to the composite key and value. Example:
LinkKeyWritable@bd2f9730 LinkValueWritable@8752408c
I am not sure if the problem is related to not reducing the data at all, or something else.
Here is my main class:
public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(Parser.class);
    conf.setJobName("raw_parser");

    conf.setOutputKeyClass(LinkKeyWritable.class);
    conf.setOutputValueClass(LinkValueWritable.class);

    conf.setMapperClass(RawMap.class);
    // Map-only job: write the mapper output directly, reduce later
    conf.setNumReduceTasks(0);

    conf.setInputFormat(PerFileInputFormat.class);
    conf.setOutputFormat(TextOutputFormat.class);

    PerFileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    JobClient.runJob(conf);
}
And my Mapper class:
public class RawMap extends MapReduceBase implements
        Mapper<NullWritable, Text, LinkKeyWritable, LinkValueWritable> {

    public void map(NullWritable key, Text value,
            OutputCollector<LinkKeyWritable, LinkValueWritable> output,
            Reporter reporter) throws IOException {
        String json = value.toString();
        SerpyReader reader = new SerpyReader(json);
        GoogleParser parser = new GoogleParser(reader);
        for (String page : reader.getPages()) {
            String content = reader.readPageContent(page);
            parser.addPage(content);
        }
        for (Link link : parser.getLinks()) {
            LinkKeyWritable linkKey = new LinkKeyWritable(link);
            LinkValueWritable linkValue = new LinkValueWritable(link);
            output.collect(linkKey, linkValue);
        }
    }
}
Link is basically a struct of various pieces of information that get split between LinkKeyWritable and LinkValueWritable.
LinkKeyWritable:
public class LinkKeyWritable implements WritableComparable<LinkKeyWritable> {
    protected Link link;

    public LinkKeyWritable() {
        super();
        link = new Link();
    }

    public LinkKeyWritable(Link link) {
        super();
        this.link = link;
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        link.batchDay = in.readLong();
        link.source = in.readUTF();
        link.domain = in.readUTF();
        link.path = in.readUTF();
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeLong(link.batchDay);
        out.writeUTF(link.source);
        out.writeUTF(link.domain);
        out.writeUTF(link.path);
    }

    @Override
    public int compareTo(LinkKeyWritable o) {
        return ComparisonChain.start().
                compare(link.batchDay, o.link.batchDay).
                compare(link.domain, o.link.domain).
                compare(link.path, o.link.path).
                result();
    }

    @Override
    public int hashCode() {
        return Objects.hashCode(link.batchDay, link.source, link.domain, link.path);
    }

    @Override
    public boolean equals(final Object obj) {
        if (obj instanceof LinkKeyWritable) {
            final LinkKeyWritable o = (LinkKeyWritable) obj;
            return Objects.equal(link.batchDay, o.link.batchDay)
                    && Objects.equal(link.source, o.link.source)
                    && Objects.equal(link.domain, o.link.domain)
                    && Objects.equal(link.path, o.link.path);
        }
        return false;
    }
}
LinkValueWritable:
public class LinkValueWritable implements Writable {
    protected Link link;

    public LinkValueWritable() {
        link = new Link();
    }

    public LinkValueWritable(Link link) {
        this.link = new Link();
        this.link.type = link.type;
        this.link.description = link.description;
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        link.type = in.readUTF();
        link.description = in.readUTF();
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeUTF(link.type);
        out.writeUTF(link.description);
    }

    @Override
    public int hashCode() {
        return Objects.hashCode(link.type, link.description);
    }

    @Override
    public boolean equals(final Object obj) {
        if (obj instanceof LinkValueWritable) {
            final LinkValueWritable o = (LinkValueWritable) obj;
            return Objects.equal(link.type, o.link.type)
                    && Objects.equal(link.description, o.link.description);
        }
        return false;
    }
}
I think the answer is in the implementation of the TextOutputFormat. Specifically, the LineRecordWriter's writeObject method:
/**
 * Write the object to the byte stream, handling Text as a special
 * case.
 * @param o the object to print
 * @throws IOException if the write throws, we pass it on
 */
private void writeObject(Object o) throws IOException {
    if (o instanceof Text) {
        Text to = (Text) o;
        out.write(to.getBytes(), 0, to.getLength());
    } else {
        out.write(o.toString().getBytes(utf8));
    }
}
As you can see, if your key or value is not a Text object, it calls the toString method on it and writes that out. Since you've left toString unimplemented in your key and value, it's using the Object class's implementation, which is writing out the reference.
I'd say that you should try writing an appropriate toString function or using a different OutputFormat.
It looks like you have a list of objects just like you wanted. You need to implement toString() on your Writable if you want a human-readable version printed out instead of an ugly Java reference.
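For example, a toString() for the key could look like this (a sketch; the tab separator is just a convention, pick whatever your downstream processing expects):

@Override
public String toString() {
    // Human-readable form written by TextOutputFormat instead of the object reference.
    return link.batchDay + "\t" + link.source + "\t" + link.domain + "\t" + link.path;
}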

Entity Framework throwing error when updating records

Store update, insert, or delete statement affected an unexpected number of rows (0). Entities may have been modified or deleted since entities were loaded. Refresh ObjectStateManager entries.
I get this error, but I am the only person using the database. I am using Entity Framework 4.1 with DbContext.
I am updating my records, and SQL Profiler shows the query being sent in. What could be the causes of this issue?
The post:
[HttpPost]
public ActionResult EditUser(User user)
{
    uow.UserRepository.Update(user);
    uow.Save();
    return RedirectToAction("Index", "User");
}
On this call:
public void Save()
{
    _context.SaveChanges();
}
This is how it is attached:
public virtual void Update(TEntity entityToUpdate)
{
    dbSet.Attach(entityToUpdate);
    context.Entry(entityToUpdate).State = EntityState.Modified;
}
Update:
public class UnitOfWork : IDisposable
{
    private StudentSchedulingEntities _context = new StudentSchedulingEntities();
    private GenericRepository<User> userRepository;
    private GenericRepository<UserRole> userRoleRepository;
    private bool disposed = false;

    public GenericRepository<User> UserRepository
    {
        get
        {
            if (this.userRepository == null)
            {
                this.userRepository = new GenericRepository<User>(_context);
            }
            return userRepository;
        }
    }

    public GenericRepository<UserRole> UserRoleRepository
    {
        get
        {
            if (this.userRoleRepository == null)
            {
                this.userRoleRepository = new GenericRepository<UserRole>(_context);
            }
            return userRoleRepository;
        }
    }

    public void Save()
    {
        _context.SaveChanges();
    }

    protected virtual void Dispose(bool disposing)
    {
        if (!this.disposed)
        {
            if (disposing)
            {
                _context.Dispose();
            }
        }
        this.disposed = true;
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }
}
The ID field must be there in order to update the record properly; otherwise the posted entity comes back with a default key and the update affects zero rows. (I forgot to put the hidden field for the ID in the view.)
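A minimal sketch of that fix in the Razor edit view (assuming the key property is named Id; adjust to your model):

@* Keeps the primary key in the form so it round-trips on POST *@
@Html.HiddenFor(model => model.Id)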
